
Intel HD graphics in CPU

Started by January 12, 2012 02:01 PM
31 comments, last by swiftcoder 12 years, 9 months ago

Getting back on topic, I think this reflects my opinion on integrating graphics into the CPU:
NVIDIA (myself included) is unimpressed, saying that “Sandy Bridge” is a “turboprop in an age of jet engines.”

Is NVIDIA unaware that turboprops are still more efficient for many commuter flights, exhibition flying, and military applications, and are still widely used? I'm not sure whether they intended that in their metaphor, but it irks me when people don't realize the actuality behind their metaphors.
Let's go against the flow: the latest Intel graphics are a big deal. Why?

Because while they might not be top of the line, they are, for the first time in history, not a joke. They allow people like me to play their Minecraft and Civilization without shelling out for a discrete GPU. Why do you care? Well, casual gamers such as myself are no longer subsidizing the dedicated graphics industry. And people like me are a huge chunk of the market. Which means that, all else being equal, dedicated graphics development will slow down, and those insisting on the latest and greatest in graphics are going to have to pay a higher price for it.

I don't see that with glee, because I love graphics technology and cheap HPC. But it might help overturn the game industry's obsession with doing little but cramming the latest graphics hardware to the fullest, other aspects be damned.

Not much. It may be able to assist on CUDA tasks, but I don't know if the drivers actually enable that.
No; when an external graphics driver like NVIDIA's is installed, it disables the onboard graphics, leaving only the partition.
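
For what it's worth, it's easy to check what CUDA itself can see on a given box by enumerating the devices the runtime reports. This is only a minimal sketch (it assumes the CUDA toolkit is installed and the file is compiled/linked against the CUDA runtime, e.g. with nvcc); the Intel HD part will never show up in this list, since it isn't CUDA-capable:

// cuda_check.cpp - minimal sketch; assumes the CUDA toolkit is installed and
// the file is compiled/linked against the CUDA runtime (e.g. nvcc cuda_check.cpp).
#include <cstdio>
#include <cuda_runtime.h>

int main()
{
    int deviceCount = 0;
    cudaError_t err = cudaGetDeviceCount(&deviceCount);
    if (err != cudaSuccess) {
        // No CUDA-capable device is visible (or the driver is missing/disabled).
        std::printf("cudaGetDeviceCount failed: %s\n", cudaGetErrorString(err));
        return 1;
    }

    for (int i = 0; i < deviceCount; ++i) {
        cudaDeviceProp prop;
        cudaGetDeviceProperties(&prop, i);
        // Only NVIDIA GPUs appear here; an Intel HD adapter never will.
        std::printf("Device %d: %s (compute capability %d.%d)\n",
                    i, prop.name, prop.major, prop.minor);
    }
    return 0;
}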

As for the OP, Intel HD graphics is a big joke. It sucks up RAM for its video memory and is only really useful for video editing, watching HD video, and the like; clearly it's not worth it for high-end gaming. It was made to support the processor and for common purposes.
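
If you're curious how much of that is actually coming out of system RAM on your machine, you can ask DXGI what each adapter reports. This is just a rough sketch (Windows only, MSVC, link against dxgi.lib); on an Intel HD part the dedicated number is tiny and the shared system memory number is the RAM it borrows:

// vram_check.cpp - rough sketch; Windows only, MSVC, link against dxgi.lib.
// Prints dedicated video memory vs. shared system memory for each adapter.
#include <cwchar>
#include <dxgi.h>

#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        // On an integrated Intel part, DedicatedVideoMemory is tiny and
        // SharedSystemMemory is the system RAM it can borrow.
        std::wprintf(L"%ls\n  dedicated VRAM: %llu MB\n  shared system memory: %llu MB\n",
                     desc.Description,
                     (unsigned long long)(desc.DedicatedVideoMemory / (1024 * 1024)),
                     (unsigned long long)(desc.SharedSystemMemory / (1024 * 1024)));
        adapter->Release();
    }
    factory->Release();
    return 0;
}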

Let's go against the flow: the latest Intel graphics are a big deal. Why?

Because while they might not be top of the line, they are, for the first time in history, not a joke. They allow people like me to play their Minecraft and Civilization without shelling out for a discrete GPU. Why do you care? Well, casual gamers such as myself are no longer subsidizing the dedicated graphics industry. And people like me are a huge chunk of the market. Which means that, all else being equal, dedicated graphics development will slow down, and those insisting on the latest and greatest in graphics are going to have to pay a higher price for it.


I agree. I'm kind of thrown off by people dismissing it. Maybe they all have enthusiast PCs, but for the past five or so years I've had a laptop and a laptop only. Having a laptop pretty much disqualified me from playing PC games, because even relatively crappy PC games try to push hardware further than necessary even at minimum specs; a good example is that Magicka doesn't work on most laptops despite being fairly simple. I did end up playing League of Legends and WoW just because they ran on my laptop (albeit on minimum settings).

Casual users are turning more and more to laptop desktop replacements, and the new Intel chips at least make that a market that's accessible to developers, rather than one that has to be ignored because it's just not compatible with anything.

Edit: even for enthusiast PCs, using a Sandy Bridge processor with Virtu can cut your power consumption significantly when you're doing trivial stuff.

Also, today's Intel onboard graphics are probably as good as yesterday's NVIDIA/AMD cards.

My onboard Intel graphics certainly feels as fast as my last graphics card (an NVIDIA GeForce 9500, IIRC).
Tom's Hardware has a useful chart as a guideline: http://www.tomshardware.com/reviews/gaming-graphics-card-review,3107-7.html

Obviously yes, if you go back far enough, today's highest-end onboard Intel will equal even a high-end NVIDIA/AMD card of the past. But you do have to go back a long way (their chart suggests the highest-end Intel HD 3000 is behind an NVIDIA 9500 and more like an NVIDIA 9400; or around a 6800 if you want to compare against what was NVIDIA's highest end at the time).

http://erebusrpg.sourceforge.net/ - Erebus, Open Source RPG for Windows/Linux/Android
http://conquests.sourceforge.net/ - Conquests, Open Source Civ-like Game for Windows/Linux



Yeah, for games there seems to be little difference now between the latest i7s and i5s:
Our tests demonstrate fairly little difference between a $225 LGA 1155 Core i5-2500K and a $1000 LGA 2011 Core i7-3960X,


Compare apples to apples: the 3960X is the next iteration and even uses a different socket. There's not much point in comparing a 386 with a 486.
I'm in full agreement with Eelco.

Intel's integrated graphics have been on mainstream chips since around 2004 with DirectX 9 and full shader support, and have basically kept up with technology, staying a partial round behind. Current devices support DX10 and shader model 4.0, which is rather impressive considering their ubiquity.
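
If anyone wants to see what their own chip actually exposes, the easiest check is to ask Direct3D 11 for the highest feature level it will hand back; feature level 10_0 is the DX10 / shader model 4.0 tier mentioned above. A rough sketch (Windows only, assumes the Windows SDK D3D11 headers and linking d3d11.lib):

// feature_check.cpp - minimal sketch; assumes the Windows SDK D3D11 headers
// and linking against d3d11.lib.
#include <cstdio>
#include <d3d11.h>

#pragma comment(lib, "d3d11.lib")

int main()
{
    // Feature levels to try, highest first. 10_0 == Direct3D 10 / shader model 4.0.
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3
    };
    D3D_FEATURE_LEVEL got;
    ID3D11Device* device = nullptr;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                  // default adapter (the integrated GPU on many laptops)
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr, 0,
        levels, (UINT)(sizeof(levels) / sizeof(levels[0])),
        D3D11_SDK_VERSION,
        &device, &got, nullptr);

    if (SUCCEEDED(hr)) {
        // Prints e.g. 0xa000 for feature level 10_0, 0xa100 for 10_1, 0xb000 for 11_0.
        std::printf("Highest supported feature level: 0x%x\n", (unsigned)got);
        device->Release();
    } else {
        std::printf("D3D11CreateDevice failed (0x%08lx)\n", (unsigned long)hr);
    }
    return 0;
}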

This has enabled non-gamers to play high-end games without paying premium prices.


They obviously cannot play this year's blockbusters with the graphics turned up to "ultra high", but they can certainly play them. The interesting thing is that yesteryear's titles can be played with the graphics turned up to 11, which is something many laptops and lower-spec computers simply could not do a decade ago; they still required a dedicated board to play earlier 3D titles.
The consoles seem to help here somewhat. Everything that is also released for consoles seems to have almost exactly the same requirements (recommended: GeForce 8800 GTX, minimum: GeForce 6800), which the HD 3000 just about manages.

The consoles seem to help here somewhat. Everything that is also released for consoles seems to have almost exactly the same requirements (recommended: GeForce 8800 GTX, minimum: GeForce 6800), which the HD 3000 just about manages.

That's a good point. Most games are targeted at the capability of an Xbox 360 - and even the HD 3000 should be as capable as a seven-year-old console GPU which was fairly anaemic even at the time of release...

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]


No; when an external graphics driver like NVIDIA's is installed, it disables the onboard graphics, leaving only the partition.


I have an ASRock Z68 Extreme4 Jedi Knight 3 Dark Forces 2...

I finally got off my lazy backside and put a third monitor on. At first I tried the HDMI port of my 6850, but googling (and new Catalyst drivers) revealed this is not possible, so I plugged the third monitor's DVI into the onboard graphics. I have three monitors working now, so I'm not sure that statement is entirely accurate. At one point it certainly was.

