
Should I support D3D9/OpenGL 2.x hardware, or not?

Started by December 19, 2014 05:12 AM
24 comments, last by blueshogun96 10 years ago

Well, I think it's safe to say that Frob nailed it with one statement; a statement that should run through the mind of any business-oriented person. I honestly think it says more than what everyone else could say in a paragraph or more.

Define your target market, the answer will follow.

Totally this, but if you don't need to give up any features, go as low as you can.

I myself find D3D10+ a trap of a library, steering you toward techniques that usually (nearly all of them) seriously damage performance if you actually use them.

The only thing that may force me to drop DX9 in the future is the Shader Model 3.0 limit. I haven't needed a higher shader model; instead I restructured my rendering to be saner (no 25 interpolated attributes per pixel shader, and such).

For me, sticking to standard routines and building with them seems to produce faster software, even if I need redundant draws and render targets, than reaching for some "amazing" DX11 instruction. I don't sample textures in the vertex shader, I don't render to vertex buffers, I don't alter the device coordinates of drawn pixels (I use a different trick for that instead), and I could go on. In my experience, if you switch some games from DX9 to DX10, you notice minimal graphical improvement with an enormous performance drop (Stalker, for example).

And finally, a rendering engine is just more attractive if it can run its demos on DX9/OGL 2.0 and still have them be eye candy.

In the end, I was expecting that in DX11 you would be able to define rasterization rules (not at all), and read pixel properties against them in the pixel shader (not at all); instead DX11 added fancy AAA antialiasing (which would have been easily doable if the above had been implemented). You can't use the GPU for wider computation. For me, DX10/DX11 was a big disappointment compared to what I had been hoping for.

Although I also found D3D10/11 to be disappointing in some areas, I'm actually beginning to like certain things better than D3D9. As much as I don't like the API design (most likely because I don't understand the design choices), it's in "the now", and that's what I need to be targeting. I've been tailoring my rendering engine to D3D and core OpenGL to minimize performance loss. I can choose between any supported API just by setting a single flag.

The last thing I want to do is hold my engine back with legacy stuff, because that's just more maintenance than I believe it's worth. That's especially true if the target demographic has a machine that can handle a more modern API.

http://store.steampowered.com/hwsurvey

This can also be used to help gauge your market/target audience.

I saw this, but if there's one thing I'm certain of, it's that Steam doesn't accurately represent the PC/Mac gaming scene as a whole. Casual gamers (i.e. Angry Birds, Flappy Bird, Plants vs. Zombies, etc.) are less likely to have top-of-the-line hardware. Heck, there are people with 5+ year old laptops running some crappy Intel GPU. Worst case scenario, some people don't have their own computers and use publicly available ones (which are often bought used, and use low-end hardware to begin with). My latest title (which isn't using the engine I've written, because the game pre-dates it by over a year) intentionally uses OpenGL 2.0 because it's a casual game with a wide target market/demographic.

This is good information though. I'd like to release my game on Steam, but that's just one of many channels of distribution.

You will be cutting off a ***LOT*** of folks if you just concentrate on "current generation".

Believe it or not, only the more 'wealthy' folks upgrade their PCs every year.

Heck - there are still lower end PCs being sold today that only support the earlier generation graphics standards.

If you are making a 2D (or simplistic 3D) game, don't be an ### (like a couple of game companies I have seen) and require OpenGL 4 / DirectX 11.

I dunno about this. As Frob stated, consider who you're targeting. For a casual game, probably. But for a more extravagant title, I think supporting legacy hardware will only hold the game back (and if the person buying a computer wants to play extravagant titles, they should know better).

But considering that D3D11 allows you to run on D3D9-level hardware, there isn't much loss there (unless you're targeting WinXP). Requiring D3D11-level hardware for a visually modest game, though, is ridiculous.
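The feature-level mechanism mentioned above boils down to "the first requested level the hardware meets wins". Below is a minimal sketch of that selection logic; the type and function names are hypothetical (real code would simply hand a descending feature-level array to D3D11CreateDevice and let it do this walk):

```cpp
#include <cstdint>
#include <optional>
#include <vector>

// Hypothetical encoding mirroring D3D's 0xMMmm feature-level values
// (D3D_FEATURE_LEVEL_11_0 is 0xb000, 9_1 is 0x9100, and so on).
enum class FeatureLevel : std::uint16_t {
    FL_9_1  = 0x9100,
    FL_9_3  = 0x9300,
    FL_10_0 = 0xa000,
    FL_10_1 = 0xa100,
    FL_11_0 = 0xb000,
};

// Walk the requested levels in descending order and return the first one
// the hardware can satisfy -- the same contract D3D11CreateDevice fulfils
// when you pass it a feature-level array.
std::optional<FeatureLevel> pick_feature_level(
        const std::vector<FeatureLevel>& requested,
        FeatureLevel hardware_max) {
    for (FeatureLevel fl : requested)
        if (static_cast<std::uint16_t>(fl) <= static_cast<std::uint16_t>(hardware_max))
            return fl;  // highest requested level the device supports
    return std::nullopt;  // device is below every requested level
}
```

This is why a D3D11 code path can still ship to D3D9-class GPUs: the same device creation call simply comes back with a 9_x level, and the engine scales its effects accordingly.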

Lastly, I personally wouldn't consider D3D10 or OpenGL 3 hardware current generation either.

Right now, my engine only supports Direct3D10/OpenGL 3.x level hardware and above.

If you like macs (and I know you do), go 3.2+, otherwise I'd leave it at 3.3+.

3.3 is in theory supported by all D3D10-level hardware (except Intel on Windows, because reasons), and it has a few nice things (samplers, explicit attribute locations, etc.). You also pretty much guarantee that if a card can run OGL 3.3, it's either new hardware or old hardware with new drivers (new drivers == good thing, especially when doing OGL).

With 3.2 you give up a few things, but you gain older OSX support.

By using OGL 2.x you reintroduce lots of legacy crap into your render code; I don't think it's worth the effort. You'd basically be trying to support either people who don't upgrade their drivers (but you're using OGL, so you can't support those anyway), or people with GeForce 7xxx and ATI Radeon X1xxx cards, and I don't think there are enough of those around to justify it.

Seriously, D3D10 level hardware was released 8 years ago. Let D3D9 go man, if you truly love it, let it go.

EDIT: have a look at this chart. As you can see (R600 is the code name for the ATI HD 2xxx series, G8x for the nVidia GeForce 8xxx series), everything (that matters) supports OGL 3.3 except Sandy Bridge (aka Intel HD 2xxx and HD 3xxx), and that's on Windows; Intel HD 3xxx on Linux does support OGL 3.3.

My engine is capable of determining the highest possible core OpenGL context version. It will automatically choose the best profile for the version selected.
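Detecting the highest usable core context version typically starts by parsing the driver's version string. Here is a small sketch of that step; the function names are my own, and in real code the input would come from glGetString(GL_VERSION) after creating a probe context:

```cpp
#include <cstdio>
#include <utility>

// Parse the "major.minor" prefix of a version string such as the
// "3.3.0 NVIDIA 340.46" that glGetString(GL_VERSION) returns.
// Returns {0, 0} when no version number is found.
std::pair<int, int> parse_gl_version(const char* s) {
    int major = 0, minor = 0;
    if (!s || std::sscanf(s, "%d.%d", &major, &minor) != 2)
        return {0, 0};
    return {major, minor};
}

// Decide whether a 3.3 core profile can be requested at all;
// the engine would fall back to 3.2 (or lower) otherwise.
bool supports_core_33(std::pair<int, int> v) {
    return v.first > 3 || (v.first == 3 && v.second >= 3);
}
```

With the version known, the engine can pass the appropriate numbers to the platform's context-creation attributes (e.g. WGL_CONTEXT_MAJOR_VERSION_ARB / GLX equivalents) and request the matching core profile.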

Well, there has to be a cutoff point somewhere. For example, devs could only support 3Dfx for so long after it went belly up. I'm not worried about those who are still zealously hanging on to their Hercules Kyro II.

Another thing: I don't (and never have planned to) literally support D3D9. I can't afford to have my engine held back by legacy APIs. Like UE, any serious engine has to evolve with the times. If I could, I'd leave OpenGL ES 2.0 in the dust, but unfortunately, ES 3.0 devices haven't become predominant the way ES 2.0 devices did. I don't believe for a moment that there are any engines that still support OpenGL ES 1.1 either (at least, no serious ones).

Just use GLEW so there's no difference between 2 and 3. What features do you want anyway? Deferred rendering?

Already using that. As for features, I want every and any advanced feature I can support.

Shogun.

imho yes, even ogl 1.x.


imho yes, even ogl 1.x.

No.

It's time for Old Yeller to be taken out back.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.
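The glVertexAttrib quip refers to a real GL2-era workaround, often called pseudo-instancing: set the per-instance data as a *constant* vertex attribute, then issue one draw call per instance. A sketch of the loop, with the GL calls replaced by recorded commands so it is self-contained (all names here are hypothetical; real code would call glVertexAttrib4f followed by glDrawElements):

```cpp
#include <vector>

// Per-instance data (e.g. a world-space offset) that newer APIs would
// feed through a second vertex stream.
struct Instance { float x, y, z, w; };

// A recorded stand-in for the pair of GL calls, so the loop is testable
// without a live context.
struct DrawCmd { Instance attrib; int index_count; };

// Pseudo-instancing: one draw per instance, with the per-instance vector
// supplied as a constant vertex attribute -- no instancing API needed.
std::vector<DrawCmd> draw_instanced(const std::vector<Instance>& instances,
                                    int index_count) {
    std::vector<DrawCmd> cmds;
    for (const Instance& inst : instances)
        cmds.push_back({inst, index_count});  // "set attrib, then draw"
    return cmds;
}
```

The per-draw overhead is higher than true instancing, but for moderate instance counts it was good enough that D3D9-class GL renderers got by without the feature.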

imho yes, even ogl 1.x.


No.

It's time for Old Yeller to be taken out back.

I disagree. Every standard should be supported from now until the end of time. Why stop at OGL 1?
There might be someone out there with a 486 and an EGA chipset... why shouldn't they get to play the game too?
"If you think programming is like sex, you probably haven't done much of either." - capn_midnight

According to my own investigation, there are still half a billion notebooks/netbooks with Intel GMA 9xx out there, which only support OpenGL 1.x. Whoever doesn't support them loses 20-40% of their revenue. Of course, if somebody doesn't want to understand business, or doesn't want to sell their games, they can even use OpenGL 4, shutting themselves out of 99.9% of the market. That's how the real world works outside the dreamworld, folks. (Yes, my engine crashes on most Intel IGPs too... but at least I tried to fix it several times.)

Dude, Unreal Engine targets D3D9 hardware and above, CryEngine the same, Unity the same, Source the same, and Frostbite is D3D11-only IIRC. I'm pretty sure they do want to sell their games and engines, and do understand business, given Epic has been in it from the beginning.

And to be honest, if you can choose OpenGL 1.1 as a realistic target, you're not doing anything complex graphics wise to begin with. You could probably use a software renderer if you wanted to.

"I AM ZE EMPRAH OPENGL 3.3 THE CORE, I DEMAND FROM THEE ZE SHADERZ AND MATRIXEZ"

My journals: dustArtemis ECS framework and Making a Terrain Generator


New games aren't supporting D3D10 cards anymore... T_T my PC got old

(at least Alien Isolation and Shadow of Mordor are the ones I'm aware of that are D3D11-only...)


According to my own investigation, there are still half a billion notebooks/netbooks with Intel GMA 9xx out there, which only support OpenGL 1.x.

I suppose that my question is whether those laptops are used for gaming, and if they are, whether they're used for the sort of gaming that the developer in question has in mind. If, for example, I were working on a fast-paced first-person-shooter, do the users of those devices tend to play such games? My guess (And I do stand to be corrected on this) is that users of such devices more likely either don't use them for gaming, or play casual games in spare time, and would likely not be in the audience for a first-person-shooter, or an in-depth RPG, etc.

MWAHAHAHAHAHAHA!!!

My Twitter Account: @EbornIan

According to my own investigation, there are still half a billion notebooks/netbooks with Intel GMA 9xx out there, which only support OpenGL 1.x. Whoever doesn't support them loses 20-40% of their revenue. Of course, if somebody doesn't want to understand business, or doesn't want to sell their games, they can even use OpenGL 4, shutting themselves out of 99.9% of the market. That's how the real world works outside the dreamworld, folks. (Yes, my engine crashes on most Intel IGPs too... but at least I tried to fix it several times.)

No, that's not the way things work.

If you don't support GL 1.x you don't suddenly and automatically lose 20% to 40% of your revenue, because those Intel equipped notebooks may not have even been potential customers. That's why we always say "research your target audience" in response to questions like this. If somebody was never going to buy your game in the first place, they're not a potential customer and they're not a lost sale.

You're also claiming that the choice is between supporting GL 1.x on the one hand, versus only supporting GL 4.x on the other. That's false. If you're targeting a similar demographic as the Steam hardware surveys, you can support GL 3.x or above with 97% inclusiveness.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

Dude, Unreal Engine targets D3D9 hardware and above, CryEngine the same, Unity the same, Source the same, and Frostbite is D3D11-only IIRC. I'm pretty sure they do want to sell their games and engines, and do understand business, given Epic has been in it from the beginning.

And to be honest, if you can choose OpenGL 1.1 as a realistic target, you're not doing anything complex graphics wise to begin with. You could probably use a software renderer if you wanted to.

You're totally right; I've been using a software renderer for two years now in my newer work.


According to my own investigation, there are still half a billion notebooks/netbooks with Intel GMA 9xx out there, which only support OpenGL 1.x.

I suppose that my question is whether those laptops are used for gaming, and if they are, whether they're used for the sort of gaming that the developer in question has in mind. If, for example, I were working on a fast-paced first-person-shooter, do the users of those devices tend to play such games? My guess (And I do stand to be corrected on this) is that users of such devices more likely either don't use them for gaming, or play casual games in spare time, and would likely not be in the audience for a first-person-shooter, or an in-depth RPG, etc.

I guess casual gamers mostly play simple 2D games, or games in the web browser, on Facebook, or in Flash; in that case you can't rely on proper 3D support at all. FPS players usually have decent computers, though I don't have any statistics on this. RPG players are the typical 2D players with their 12-year-old Pentium 4.

According to my own investigation, there are still half a billion notebooks/netbooks with Intel GMA 9xx out there, which only support OpenGL 1.x. Whoever doesn't support them loses 20-40% of their revenue. Of course, if somebody doesn't want to understand business, or doesn't want to sell their games, they can even use OpenGL 4, shutting themselves out of 99.9% of the market. That's how the real world works outside the dreamworld, folks. (Yes, my engine crashes on most Intel IGPs too... but at least I tried to fix it several times.)

No, that's not the way things work.

If you don't support GL 1.x you don't suddenly and automatically lose 20% to 40% of your revenue, because those Intel equipped notebooks may not have even been potential customers. That's why we always say "research your target audience" in response to questions like this. If somebody was never going to buy your game in the first place, they're not a potential customer and they're not a lost sale.

You're also claiming that the choice is between supporting GL 1.x on the one hand, versus only supporting GL 4.x on the other. That's false. If you're targeting a similar demographic as the Steam hardware surveys, you can support GL 3.x or above with 97% inclusiveness.

If I look at my support mail address, it looks like they are potential customers. IMHO, targeting only gamer configurations is potentially economic suicide.

If I had a time machine, I would just draw the line at OpenGL 1.1 and never even touch any extension or feature above that. Both my life and my users' lives would have been much easier. Blindly following the technology does not necessarily mean more success; the evolution of OpenGL is a typical example of this.

The only modern feature that people want, in my opinion, is real-time shadows.

However, shadow maps require newer OpenGL versions, hardware, and extensions (1.4/2.0).

Maybe it's a good idea to add just this one extension, if it's available, while keeping the rest of the engine solidly on OpenGL 1.1.
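On a 1.x context, testing for that one extension means token-matching the space-separated string that glGetString(GL_EXTENSIONS) returns. A sketch of a safe check (the helper name is my own):

```cpp
#include <cstring>

// Exact-token search over a space-separated extension list, the format
// glGetString(GL_EXTENSIONS) uses on a GL 1.x context. A bare strstr
// would wrongly match "GL_ARB_shadow" inside "GL_ARB_shadow_ambient".
bool has_extension(const char* extensions, const char* name) {
    if (!extensions || !name || !*name) return false;
    const std::size_t len = std::strlen(name);
    for (const char* p = extensions;
         (p = std::strstr(p, name)) != nullptr; ++p) {
        const bool starts = (p == extensions) || (p[-1] == ' ');
        const bool ends   = (p[len] == ' ') || (p[len] == '\0');
        if (starts && ends) return true;  // whole token, not a prefix
    }
    return false;
}
```

If the check succeeds, the engine can enable the shadow-map path; otherwise it falls back to the plain 1.1 renderer, which is exactly the "one optional extension" strategy described above.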

But I already switched to software rendering, so I don't really care about graphics drivers; I've started throwing out my graphics test hardware too. Since then, I worry much less, I write much less code, and I get zero compatibility problems.

This topic is closed to new replies.
