
Should I support D3D9/OpenGL 2.x hardware, or not?

Started December 19, 2014 05:12 AM
24 comments, last by blueshogun96 10 years ago

I'm really curious as to what you all would say about this, because I'm beginning to reconsider....

Right now, my engine only supports Direct3D 10/OpenGL 3.x-level hardware and above. From the beginning of my engine's creation, I wanted to keep it that way. I've looked at the system requirements of numerous games on Steam and noticed that many of them still support hardware from previous generations. AFAIK, most gamers already have D3D10- or OpenGL 3-capable hardware, so supporting anything older seems like a waste (kinda like supporting OpenGL ES 1.1 when the vast majority of mobile devices support ES 2.0). I could be totally wrong, and if so, adding OpenGL 2.1 support won't really be hard or take long at all.
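For reference, the runtime check I'd need for a 2.1 fallback is tiny - something like this sketch (it assumes a context is already current, and the Renderer classes are hypothetical placeholders, not real engine code):

    // Sketch: pick a backend from the version the driver reports.
    #include <GL/gl.h>
    #include <cstdio>

    int GetGLMajorVersion()
    {
        // The version string looks like "3.3.0 NVIDIA 344.75", "2.1 Mesa ...", etc.
        const char* ver = reinterpret_cast<const char*>(glGetString(GL_VERSION));
        int major = 0, minor = 0;
        if (ver)
            std::sscanf(ver, "%d.%d", &major, &minor);
        return major;
    }

    // At startup, hypothetically:
    //   Renderer* r = (GetGLMajorVersion() >= 3) ? new GL3Renderer()
    //                                            : new GL2Renderer();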

So, what do you think? I tried googling this and searching for other people's opinions, but I guess I didn't use the right keywords, because I could never find anything from a dev's perspective.

Shogun.

More support means more customers. You just have to determine if the additional profit outweighs the cost of the extra dev time.

void hurrrrrrrr() {__asm sub [ebp+4],5;}

There are ten kinds of people in this world: those who understand binary and those who don't.
Define your target market, the answer will follow.

How far back do you have to go to find PCs that were sold without D3D10-supporting hardware, given that even integrated Intel chips from 2-3 generations ago support it (albeit they're pretty crap)?

Will your game actually run on PCs that old and slow anyway?
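If you'd rather measure than guess, you can probe for a D3D10-class device at startup and count how many of your users actually fail it. A minimal sketch (a real D3D10CreateDevice call, but untested here and with error handling omitted):

    // Probe for D3D10-class hardware by attempting to create a device.
    // The device is released immediately; no swap chain is needed.
    #include <d3d10.h>
    #pragma comment(lib, "d3d10.lib")

    bool HasD3D10Hardware()
    {
        ID3D10Device* device = nullptr;
        HRESULT hr = D3D10CreateDevice(nullptr, D3D10_DRIVER_TYPE_HARDWARE,
                                       nullptr, 0, D3D10_SDK_VERSION, &device);
        if (SUCCEEDED(hr) && device)
        {
            device->Release();
            return true;    // otherwise, fall back to a D3D9/GL2 path
        }
        return false;
    }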

Define your target market, the answer will follow.

Totally this. But if you do not need to give up any features, go as low as it gets.

I myself find D3D10+ a trappy library, pointing you toward techniques that usually (nearly all of them) seriously damage performance - if you actually use them.

The only thing I may need to drop DX9 for in the future is the shader model 3.0 limit. I have not needed a higher shader model; I instead reworked my rendering to be more sane (no 25 interpolated attributes into the pixel function and such).
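(For what it's worth, checking for SM 3.0 before committing to it is cheap - a sketch against the D3D9 caps, nothing more:)

    // Sketch: ask the D3D9 caps whether the default adapter supports SM 3.0.
    #include <d3d9.h>

    bool SupportsShaderModel3(IDirect3D9* d3d)
    {
        D3DCAPS9 caps = {};
        if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
            return false;
        return caps.VertexShaderVersion >= D3DVS_VERSION(3, 0)
            && caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0);
    }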

For me, sticking to the standard routines and building with them seems to produce faster software, even if I need redundant draws or render targets, compared to reaching for some "amazing" DX11 instruction. I do not sample textures in the vertex function, I do not render to vertex buffers, I do not alter the device coordinate of the drawn pixel (I instead use a different trick for that), and I could go on. In my experience, if you switch some games from DX9 to DX10, you notice minimal graphical advance with an enormous performance drop (S.T.A.L.K.E.R., for example).

And finally, a rendering engine is just more attractive if it can run its demos on DX9/OGL 2.0 and still have them be eye candy.

In the end, I was expecting that in DX11 one would be able to define rasterization rules (not at all), or to read pixel properties against them in the pixel function (not at all); instead DX11 added a fancy AAA antialiasing (which would be amazingly doable if the above had been implemented). You cannot use the GPU for wider computation. For me DX10/DX11 was a big disappointment compared to what I had been hoping for.

You will be cutting off a ***LOT*** of folks if you just concentrate on "current generation".

Believe it or not, only the more 'wealthy' folks upgrade their PCs every year.

Heck - there are still lower-end PCs being sold today that only support the earlier-generation graphics standards.

If you are making a 2D (or simplistic 3D) game, don't be an ### (like a couple of game companies I have seen) and require OpenGL 4 / DirectX 11.

I cannot remember the books I've read any more than the meals I have eaten; even so, they have made me.

~ Ralph Waldo Emerson


http://store.steampowered.com/hwsurvey

This can also be used to help gauge your market/target audience.

Hello to all my stalkers.

Blender uses OpenGL 1.4, the earliest version to support shaders (via the ARB extensions) - go that far back if you want to support 2007 computers.
Right now, my engine only supports Direct3D10/OpenGL 3.x level hardware and above.

If you like Macs (and I know you do), go 3.2+; otherwise I'd leave it at 3.3+.

With 3.3, you're in theory supported by all D3D10-level hardware ('cept for Intel on Windows, because reasons), and it has a few nice things (sampler objects, explicit attribute locations, etc). You also pretty much guarantee that if a card can run OGL 3.3, it's either new hardware or old hardware with new drivers (new drivers == a good thing, especially if you're doing OGL).

With 3.2 you give up a few things, but you gain older OSX support.
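If you use something like GLFW for context creation, the "3.3 first, 3.2 for older OSX" logic is only a few lines. A sketch (GLFW 3.x API; the forward-compat hint is required on OSX, error handling trimmed):

    // Sketch: request a 3.3 core context, fall back to 3.2 core.
    #include <GLFW/glfw3.h>

    GLFWwindow* CreateCoreContextWindow(int w, int h, const char* title)
    {
        for (int minor = 3; minor >= 2; --minor)
        {
            glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
            glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, minor);
            glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
            glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GL_TRUE); // needed on OSX
            GLFWwindow* win = glfwCreateWindow(w, h, title, nullptr, nullptr);
            if (win)
                return win;
        }
        return nullptr;
    }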

By using OGL 2.x you reintroduce lots of legacy crap into your render code; I don't think it's worth the effort. You'd basically be trying to support either people who don't upgrade their drivers - but you're using OGL, so you can't support those anyway - or people with GeForce 7xxx and ATI Radeon X1xxx cards, and I don't think there are enough of those around to justify it.

Seriously, D3D10-level hardware was released 8 years ago. Let D3D9 go, man; if you truly love it, let it go.

EDIT: have a look at this chart. As you can see (R600 is the code name for the ATi HD 2xxx series, G8x for the nVidia GeForce 8xxx series), everything (that matters) supports OGL 3.3 except for Sandy Bridge (aka Intel HD 2000 and HD 3000), and that's on Windows; Intel HD 3000 on Linux does support OGL 3.3.

"I AM ZE EMPRAH OPENGL 3.3 THE CORE, I DEMAND FROM THEE ZE SHADERZ AND MATRIXEZ"

My journals: dustArtemis ECS framework and Making a Terrain Generator

Just use GLEW so there's little practical difference between loading 2 and 3. What features do you want, anyway? Deferred rendering?
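GLEW handles the function loading; you just branch once on what it reports. Roughly like this (a sketch - it assumes a context is current before glewInit, and the two paths are placeholders):

    // Sketch: after context creation, let GLEW report what's available.
    #include <GL/glew.h>

    bool PickRenderPath()
    {
        glewExperimental = GL_TRUE;   // needed with core profile contexts
        if (glewInit() != GLEW_OK)
            return false;
        if (GLEW_VERSION_3_3)
        {
            // modern path: sampler objects, explicit attribute locations, ...
        }
        else if (GLEW_VERSION_2_1 && GLEW_EXT_framebuffer_object)
        {
            // legacy path: FBOs via extension still allow deferred rendering
        }
        return true;
    }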

This topic is closed to new replies.
