
What's up with NVIDIA lately?

Started by November 21, 2010 07:21 PM
30 comments, last by zedz 14 years, 3 months ago
Quote:
Original post by Antheus
Quote:
Original post by ChaosEngine
I have to say, I don't get the concept of being "loyal" to something like a graphics card. Unless you are specifically targeting the capabilities of one line (which seems like a foolish thing to do), the choice of ATI or NVIDIA is purely dependent on performance/reliability.
Loyalty is crucial precisely because there is no difference. At the end of the day, individual titles will have unique performance that is mostly unrelated to the card itself, and instead comes down to a combination of many factors.

And while statistically one card will outperform the other, an individual user only cares about one single title.

Brand is a big deal.

The other side is price ranges. Brand loyalty may push sales in individual ranges higher while acquiring new customers in the low and budget price ranges. Earning the loyalty of those customers means they'll upgrade.

There is brand identity. When Title X ships with the nVidia logo, it offers some comfort/recognition/... to those loyal to the brand.

Finally, there is brand goodwill. Nobody has ever spoken against Google. Everyone hates Microsoft. If your brand gets on the good side, you get positive bias. Not only from the media (which can be bought), but also from the increasingly important viral branding - bloggers, enthusiasts, the geek in a group of friends whom people call when buying new machines.

Brand is all that still matters. Technology, reliability, performance, ... The majority of the market can't evaluate those objectively.

Here is an example of brand strength: "Macs do not crash and they never did." When did you last encounter people bitching about Macs crashing? "Oh, the bomb? It's cute." What do the numbers say - statistically, which crashes more often, Windows or Mac?




Mac vs. Windows is a totally different issue. My day to day computing experience (and by extension my life since I spend such a large % of it on a computer) will be different depending on OS. "Loyalty" here makes sense. I have an investment in one or the other and there is a significant cost (both monetary and time) to switching. With a graphics card, I can pop in a new card and all that should change is I get a better framerate (regardless of manufacturer).

I was speaking from the end user's POV. Obviously the companies want to have brand loyalty, but that's just marketing blurb. My point is that we (as consumers, but especially as developers) should be objective about such things.
if you think programming is like sex, you probably haven't done much of either. - capn_midnight
Quote:
Original post by ChaosEngine
Mac vs. Windows is a totally different issue. My day to day computing experience (and by extension my life since I spend such a large % of it on a computer) will be different depending on OS. "Loyalty" here makes sense.

Technically speaking, that's not "loyalty". It's "investment".
I believe when Antheus spoke about Mac vs Windows, he was referring to people's tendency, when one of those two companies makes a mistake, to think "Oh, Microsoft, they did it again!!", while with Apple the tendency is to ignore it and act as if it didn't happen ("Jobs has his reasons", or "well, not everyone is perfect"), regardless of how silly MS's mistake was or how big Apple's mistake was.
There's a prejudgment installed in people based on the past, related to brand loyalty, which is hard to reverse even if one tries to stay objective or the company has significantly changed since then.

Quote:
Original post by ChaosEngine
With a graphics card, I can pop in a new card and all that should change is I get a better framerate (regardless of manufacturer).

Framerate, price, heat, and power consumption, in that order, are four of the six relevant factors I use to decide. But the two major ones for me, as a developer, are the dev tools and, to a lesser extent, market penetration (I want to use what my customers will be using to run the game).

But things like framerate are subjective. They're only objective if a given GPU outperforms every other graphics card out there in each and every aspect; but when game A runs slower and game B runs faster, which one is better on average? Averaging framerates is useless (average the frame times in milliseconds instead), because we have to analyze why they run slower/faster (is it due to higher memory bandwidth? more ALUs?), and whether my current or future applications will benefit from that advantage.
A placebo effect may come into play here ("nvidia is better, so on average I should get faster speeds"). And so does brand loyalty.
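The point about averaging frame times rather than framerates can be shown with a quick sketch. The cards and numbers below are made up purely for illustration:

```python
# Hypothetical numbers: two cards benchmarked in two games.
# Averaging FPS rewards the card with one very fast result;
# averaging frame times (ms) weights every frame equally.

def mean(xs):
    return sum(xs) / len(xs)

def fps_to_ms(fps):
    return 1000.0 / fps  # frame time in milliseconds

card_x_fps = [30, 60]  # uneven: slow in game A, very fast in game B
card_y_fps = [40, 45]  # consistent across both games

avg_fps_x = mean(card_x_fps)  # 45.0 -> X "wins" on average FPS
avg_fps_y = mean(card_y_fps)  # 42.5

avg_ms_x = mean([fps_to_ms(f) for f in card_x_fps])  # 25.0 ms
avg_ms_y = mean([fps_to_ms(f) for f in card_y_fps])  # ~23.6 ms -> Y wins (lower is better)
```

The two averages rank the cards differently because the arithmetic mean of FPS over-weights the game where a card is already fast; mean frame time corresponds to the harmonic mean of FPS, which is why milliseconds are the saner unit to average.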

Quote:
Original post by ChaosEngine
I was speaking from the end user's POV. Obviously the companies want to have brand loyalty, but that's just marketing blurb.

Pure extreme fanaticism is bad. But one can be loyal to a brand if they've given you reasons: great warranty experiences, good customer support, stable drivers, other goodies, etc. Being loyal in this sense means I will buy the next card because my last experience was great; I want to keep that experience, and I can't be sure whether a competitor offering a seemingly better product will be there for me when something goes wrong, or will care about me 16 months after I bought it. Furthermore, it's a way of supporting the company that took care of you with your last purchase.

NVIDIA used to excel at performance, quality, driver stability, and dev tools, and their competitors had none of that.
Today, the only thing remaining is performance, and not by a big margin. Plus, the competitors are getting really good.
Quote:
Original post by Matias Goldberg
There's a prejudgment installed in people based on the past, related to brand loyalty, which is hard to reverse even if one tries to stay objective or the company has significantly changed since then.

Here is a nice casual video on the topic.
Loyalty exists because people are sentimental.

It is what it is.
Quote:
Original post by Antheus
Quote:
Original post by Matias Goldberg
There's a prejudgment installed in people based on the past, related to brand loyalty, which is hard to reverse even if one tries to stay objective or the company has significantly changed since then.

Here is a nice casual video on the topic.


Thanks for posting that! Very informative!

No, I am not a professional programmer. I'm just a hobbyist having fun...

Another -1 for NV.
The "new" Fermi looks very much like FX to me. It is expensive and doesn't deliver. At least they seemed to learn something from FX. They probably won't this time.

Previously "Krohm"

I've been another long term NVIDIA user, from GeForce 256 days, but in my case it's been entirely down to their support for Linux.

I also like quiet PCs, so I'm looking for a passively cooled alternative to my current GeForce 8600GTS.

AMD seem to have got much better at Linux drivers recently; has anybody got any practical experience of their Linux OpenGL support? I'm looking for a card that will support OpenGL 4.1 stuff.

Quote:
Original post by dave j
I've been another long term NVIDIA user, from GeForce 256 days, but in my case it's been entirely down to their support for Linux.

I also like quiet PCs, so I'm looking for a passively cooled alternative to my current GeForce 8600GTS.

AMD seem to have got much better at Linux drivers recently; has anybody got any practical experience of their Linux OpenGL support? I'm looking for a card that will support OpenGL 4.1 stuff.


Their new cards are well supported; the only real drawback with AMD today is that they drop cards from their proprietary drivers extremely early, forcing you to either upgrade your hardware or stick with an old kernel/X.org version until the OSS driver catches up enough to make the hardware usable again.
[size="1"]I don't suffer from insanity, I'm enjoying every minute of it.
The voices in my head may not be real, but they have some good ideas!
Quote:
Original post by dave j
I've been another long term NVIDIA user, from GeForce 256 days, but in my case it's been entirely down to their support for Linux.

I also like quiet PCs, so I'm looking for a passively cooled alternative to my current GeForce 8600GTS.

AMD seem to have got much better at Linux drivers recently; has anybody got any practical experience of their Linux OpenGL support? I'm looking for a card that will support OpenGL 4.1 stuff.


Not first-hand experience, but AMD have just released their open-source driver for Fusion, well ahead of general hardware availability. If you're unaware, Fusion is their combination of CPU and GPU on a single die (they call it an APU, or Accelerated Processing Unit, now, to stress its GPGPU uses). The first product will be a dual-core Atom/ULV competitor called Bobcat which also contains 80 shader cores -- basically a dedicated low-end graphics card, save the dedicated RAM and bandwidth. They're quite current too, 6x00-level stuff IIRC, but at least 5x00 for sure. Benchmarks show it as competitive with current Pentium/Celeron/Core 2 ULV processors on the CPU front, solidly outperforming Atom at all but a few tasks that work particularly well with Intel's Hyper-Threading. It also solidly bests nVidia's ION and ION2 IGPs. It does both while consuming just ~13 watts, considerably lower than even the most miserly Atom+ION setup.

They'll come out with higher-performing desktop parts with performance CPU cores next year, and I imagine they'll bump up the shader count and shader clocks to increase performance. In Bobcat's target market I imagine the shader count and clock, while quite capable as-is, are held back more by power and thermal targets (and the fact that faster shaders would probably tend to saturate the shared memory bus) than by the capability of the silicon itself.

It'd be really cool if they integrated some EDRAM on-die (or on-chip) like they do in the Xbox 360 to give the GPU some dedicated framebuffer memory and bandwidth.

In any event, it's a really good sign that AMD is early to the game with open-source driver support (for both Mesa and Gallium) for what is a major new product for them. The major distros are talking about built-in support a few months into the new year; Ubuntu 11.04 (Natty Narwhal) was cited as one such distro, and it's due April of next year.

throw table_exception("(╯°□°)╯︵ ┻━┻");

Where are all the cheap ATI GL 4 cards, though? I was about to go ATI as well.

NBA2K, Madden, Maneater, Killing Floor, Sims

This topic is closed to new replies.
