
Should I support D3D9/OpenGL 2.x hardware, or not?

Started by December 19, 2014 05:12 AM
24 comments, last by blueshogun96 10 years ago

I would tend to believe the Steam hardware surveys. Those are the people that actually buy PC games, so that would strike me as the target demographic that you'd want to support.

DirectX9 was released TWELVE YEARS AGO. DirectX10 is eight years old. DirectX11 is five years old. In my experience, the lifespan of a consumer-grade laptop can charitably be estimated at five years. The last two bargain-bin BestBuy-type laptops I have bought for <$400 have had DX10 and DX11 capable Intel integrated graphics chips, respectively.

A lot of commercial games are built using engines originating back in the D3D9 era or earlier, so they were born with support for what is now ancient hardware. Source goes back to the early 2000s (arguably earlier, depending on how much of the GoldSrc and even IdTech1 code is still around); Unreal 4, CryEngine and Unity started development in the mid 2000s. If I were building a new engine and game today, I'm not sure I would bother supporting anything less than DX11 - by the time I finished, DX12 would be out.

Eric Richards

SlimDX tutorials - http://www.richardssoftware.net/

Twitter - @EricRichards22

Following the technology blindly does not necessarily mean more success.

Again you're inventing an objection where one doesn't exist. Nobody is saying "follow the technology blindly".

What's particularly ironic is that you've switched to software rendering, meaning that you now have a much higher CPU requirement than if you were using hardware. So despite your intentions you're raising the entry level for your users anyway.

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.
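For context, instancing is exactly the sort of feature that separates the D3D9/GL2 path from the newer APIs. A rough sketch of the difference, assuming the vertex buffer and a shader with a per-instance offset at attribute location 1 are already set up, and that offsets, offsetVBO, vertexCount and instanceCount are illustrative names defined elsewhere:

// GL 2.x style: one draw call per object, feeding per-instance data
// through a constant vertex attribute each iteration.
for (int i = 0; i < instanceCount; ++i) {
    glVertexAttrib3fv(1, &offsets[i].x);              // per-instance offset
    glDrawArrays(GL_TRIANGLES, 0, vertexCount);
}

// GL 3.3 (or ARB_instanced_arrays) style: per-instance data lives in a
// buffer and the whole batch goes out in a single call.
glBindBuffer(GL_ARRAY_BUFFER, offsetVBO);
glEnableVertexAttribArray(1);
glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, 0, nullptr);
glVertexAttribDivisor(1, 1);                          // advance once per instance
glDrawArraysInstanced(GL_TRIANGLES, 0, vertexCount, instanceCount);

The second form is why people care: thousands of draw calls collapse into one, which matters far more for CPU overhead than for the GPU itself.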


If I were building a new engine and game today, I'm not sure I would bother supporting anything less than DX11

Your decision, your job, your business, your responsibility.


What's particularly ironic is that you've switched to software rendering, meaning that you now have a much higher CPU requirement than if you were using hardware. So despite your intentions you're raising the entry level for your users anyway.

Perfect solutions never exist; moving to better solutions again and again is the correct response.

Yeah, blindly following every new 3D API is a bad choice.


In my experience, the lifespan ... can charitably be estimated at five years

It depends on the market they're in.

Avid gamers will update their computers more frequently, and when they do, they tend to buy more advanced hardware.

Corporate environments vary; those with lots of tech workers are going to buy new machines more frequently. From the companies I've got some perspective into thanks to contract work, it looks like their current development machines are replaced at about five years old, while their normal business machines (basically running Outlook, Word, and Chrome) are on a longer cycle. At one of those companies they finally weaned off of XP, and the oldest machines are from circa 2002.

Personally, for my home machine I had been running on a Q6600 up until this Christmas, when I bought myself a nice new 4790K. That machine plays all of our favorite games quite well; it just doesn't keep up with Dragon Age Inquisition and a few others I've recently acquired. I've passed that older machine to my teenagers, who will likely use it for another 3-4 years. Over the years I've only added storage space and a newer graphics card. That machine is almost 7 years old, and will likely have a useful life of about 10-12 years.

Finally, I know some old neighbors who were running on Windows 98 on a machine bought back in 1999 until their kids gave Grandma a present of a new computer two years ago. Those machines were low-end when they were bought, but are adequate for their purposes.

That's why I wrote back at the beginning that defining your target market is important.

You might say "I want to define the Q6600, GeForce 8800, and 4GB memory as my key demographic." That was a high spec machine back in January 2007 when it was released. It was the first round of D3D10 class hardware. It is more powerful than you will find where kids go visit their grandparents. Even after all these years it is also more powerful than you will find in many office environments.

You might instead target a 2010 machine, perhaps an i5-2300, a GeForce GTX 480, and 4GB memory. That was a high spec back in 2010. Based on the Steam Hardware Survey, it is better than what half of all Steam users have.

If your goal is to make a game that kids can download and play on old equipment, play when they visit grandparents, play on the machines that are often handed down through two or three other people before being made available to them, then you'll still want to target DX9 class hardware with dual core around 2GHz and 2GB memory.

If your goal is to make a game that people can play on moderately fast machines, or on machines that have been handed down a single time, then target DX10 class graphics with either dual or quad core around 2.5 GHz and 4GB memory.

If your goal is a high-end modern game (probably far beyond what a hobby developer can build), then feel free to require machines built in the last two years.

Again: Define your intended market. Your requirements will follow.

Defining your target market is important, but also define the game you actually want to make. Most hobby developers won't really push the cutting edge of processing if they're reasonably careful with their algorithm selection and design, but there are still a handful of people out there who can write software that will bring large and powerful systems to their knees at run time.

So questions to ask before deciding what to support:

Who do you want to play the game? What kind of hardware do they have access to?

What do you want your game to do? What hardware and software features of the machine are you going to need? If your AI system is going to need a big chunk of the latest i7 processor, then you likely aren't going to have to worry about supporting graphics card hardware from a dozen years ago.

Then turn around and ask questions from the other end of things: What do you gain by supporting D3D9?

> Will it give you a noticeable advantage in potential market share? Lots of AAA titles seem to be doing just fine without it and have been for years already.

> What will it require of you to support it? More coding involved is the obvious answer, but don't forget about all the extra testing and support issues it brings. You effectively more than double your QA process and hardware requirements if you want to cover a reasonable spread in house. And you should never forget your QA needs. Good and effective QA can greatly improve your development process and save you ages of coding time if you engage in it early on.

Old Username: Talroth
If your signature on a web forum takes up more space than your average post, then you are doing things wrong.

Well, there's one big issue with supporting ancient APIs like OpenGL 1.x: hardware vendors are not actively testing their drivers against legacy versions of OpenGL, let alone legacy DirectX. It's increasingly common to find that games from those eras are completely broken visually, or perform poorly otherwise.

I wrote a bullet hell game using OpenGL 1.1, assuming that since it was simply a 2D game, I wouldn't need anything more than that. In retrospect, I doubt that it was a good idea. Now that I am running it on Windows 8, the speeds are below standard (45fps is not good for this genre), and Intel GPUs have horrible OpenGL drivers anyway. One of the spline design tools I wrote for that game was completely broken; none of the lines would render. Even worse, it failed to draw sprites with pow2 textures correctly.

Legacy DirectX is obviously worse. I have trouble running certain DirectDraw/Direct3D7 and older games on modern hardware (Alien versus Predator for PC was a very good example). In many cases, DirectDraw initialization fails completely. That being said, the APIs evolve with the hardware.
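As an aside, one cheap defensive measure on the OpenGL side is to log what the driver actually reports before trusting any legacy path. A minimal sketch, assuming a GL context is already current and the usual <GL/gl.h>, <cstdio> and <cstring> headers are included; the "GDI Generic" check catches Windows' software-only GL 1.1 fallback:

const char* vendor   = (const char*)glGetString(GL_VENDOR);
const char* renderer = (const char*)glGetString(GL_RENDERER);
const char* version  = (const char*)glGetString(GL_VERSION);

printf("GL_VENDOR:   %s\n", vendor   ? vendor   : "(null)");
printf("GL_RENDERER: %s\n", renderer ? renderer : "(null)");
printf("GL_VERSION:  %s\n", version  ? version  : "(null)");

// Warn if we landed on Microsoft's unaccelerated GL 1.1 implementation.
if (renderer && strstr(renderer, "GDI Generic"))
    fprintf(stderr, "Warning: no hardware-accelerated OpenGL driver found.\n");

That won't fix a broken driver, but at least your bug reports will tell you which vendor/driver combination to blame.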

The majority of serious gamers have upgraded, but there are a few out there who still have older laptops. Some people in Eastern Europe can't afford to make the jump yet, and I plan on giving away one of my NVIDIA-based laptops to an Eastern European friend who cannot afford a better machine that supports core OpenGL.

For Direct3D, I'll just allow the fallback to feature level 9_1 functionality in Direct3D 11, and *maybe* I'll do OpenGL 2.x support later on.
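A minimal sketch of what that fallback looks like at device-creation time (variable names are illustrative; no swap chain shown):

#include <windows.h>
#include <d3d11.h>

// Ask for the best available feature level, falling back as far as 9_1 so the
// same D3D11 code path still runs on old DX9-class GPUs.
const D3D_FEATURE_LEVEL levels[] = {
    D3D_FEATURE_LEVEL_11_0,
    D3D_FEATURE_LEVEL_10_1,
    D3D_FEATURE_LEVEL_10_0,
    D3D_FEATURE_LEVEL_9_3,
    D3D_FEATURE_LEVEL_9_2,
    D3D_FEATURE_LEVEL_9_1,
};

ID3D11Device*        device  = nullptr;
ID3D11DeviceContext* context = nullptr;
D3D_FEATURE_LEVEL    chosen  = D3D_FEATURE_LEVEL_9_1;

HRESULT hr = D3D11CreateDevice(
    nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr,
    0,                                    // creation flags
    levels, sizeof(levels) / sizeof(levels[0]),
    D3D11_SDK_VERSION,
    &device, &chosen, &context);

// 'chosen' reports the level you actually got, so features like geometry
// shaders can be switched off when running on the 9_x paths.

One caveat: feature level 9_1 still runs through the old D3D9-class driver path, so you're limited to roughly SM 2.0 shaders and smaller texture sizes; it isn't the full D3D11 feature set on old hardware.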

Shogun.

