
Game programmers less important in the future?

Started by August 29, 2005 01:42 PM
22 comments, last by EvilDecl81 19 years, 2 months ago
I should clarify that by programmer, I was not including software architects. Those guys should be around forever.
--Sqorgar
http://gamebastard.blogspot.com
Quote: Original post by Anonymous Poster
Actually, what I've seen is not a decrease in programmers, but an increase in "content creators" required for each project. The number of programmers hasn't really changed over time. What they've been doing has changed. You have to have the current skill set.


Yep. To expand on that, the increase in content creation will lead to more sophisticated and varied demand for programming. As tools mature they may make each tier of development easier; for example, in five years you might be able to drag and drop a very simple 2D game together in sixteen colors (for the nostalgia market, possibly). But content creation is a function of imagination and discipline, and raising the bar on those elements of entertainment will keep the creativity and expertise of programmers in demand.

Adventuredesign

Always without desire we must be found, If its deep mystery we would sound; But if desire always within us be, Its outer fringe is all that we shall see. - The Tao

The more complex and complete the middleware and tools get, the more complex and complete the games will become, so the need for programmers will stay roughly what it is today. I think it's a cycle...

Thinking that in the future there will be no need for programmers is the same as someone back in the 1970s thinking that by now there would be no need for them either.

Evolution goes all ways - the software has evolved, the development process has evolved, the programmers have evolved. We're all still here. We just make less and less money. [lol]
- blew
Consider this: In the past, companies A and B would both have made technology X themselves, each employing a team of five programmers to do so, for a total of ten programmers. Nowadays, they may well buy technology X from a middleware company, which employs the five programmers required to create the tech, plus maybe a couple extra to work on rapid bug response and tech support. An increase in sharing of tech means a decrease in the number of people creating tech, simple as that.

Richard "Superpig" Fine - saving pigs from untimely fates - Microsoft DirectX MVP 2006/2007/2008/2009
"Shaders are not meant to do everything. Of course you can try to use it for everything, but it's like playing football using cabbage." - MickeyMouse

It's a competitive market, and competition is not just on price but also on quality. If everyone uses the same middleware, then any company in such a market has an incentive to stop using it, or to modify it, to get an edge over the competition. If my competition uses a certain middleware, with its limited set of features, I have an incentive to produce a better product: by using a different, better middleware, by customizing the middleware, or by hiring my own programmers. This competition demands programmers, probably highly skilled ones, either at middleware companies or at independent developers.

Also, a middleware company will need more people than an average game development company, simply because widely used middleware has to be extremely general-purpose. For any game engine, anyone with enough imagination can conjure up a game that it cannot be used to implement efficiently. It will also take large numbers of ordinary programmers to maintain and bug-fix an aging code base.

Let's not forget all the developments in hardware. No middleware is going to stay relevant forever without updates, and somebody will have to write them.
Quote: Original post by superpig
Consider this: In the past, companies A and B would both have made technology X themselves, each employing a team of five programmers to do so, for a total of ten programmers. Nowadays, they may well buy technology X from a middleware company, which employs the five programmers required to create the tech, plus maybe a couple extra to work on rapid bug response and tech support. An increase in sharing of tech means a decrease in the number of people creating tech, simple as that.


But both companies A and B will still most likely be employing 5 programmers each; they'll just be doing new things that they didn't have time to do before.

Middleware doesn't replace programmers. It just allows you to pack more into your game to meet the players' ever-increasing demand for more shiny new features.
Quote: Original post by WillC
Middleware doesn't replace programmers.


Ain't that the truth. Where I work, we have programmers who specialize in using and extending the middleware. It's a decent job, and it allows us to do a lot more game with our money.
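For illustration, a minimal sketch of what "extending the middleware" can look like in practice: a studio subclasses a middleware-provided extension point to turn generic events into game-specific behaviour. The mw::CollisionListener interface and every other name here are invented for the example; real middleware APIs will differ.

[code]
// Hypothetical sketch: a studio-side extension of an imagined middleware API.
#include <cstdio>

namespace mw {
    // Stand-in for a middleware-provided callback interface.
    struct ContactInfo { int bodyA; int bodyB; float impulse; };

    class CollisionListener {
    public:
        virtual ~CollisionListener() {}
        virtual void onContact(const ContactInfo& info) = 0;
    };
}

// Studio-written extension: map raw physics contacts onto gameplay damage.
class DamageOnImpact : public mw::CollisionListener {
public:
    explicit DamageOnImpact(float threshold) : m_threshold(threshold) {}

    virtual void onContact(const mw::ContactInfo& info) {
        if (info.impulse > m_threshold) {
            // A real game would route this into its damage/audio/FX systems.
            std::printf("bodies %d and %d hit hard (impulse %.1f)\n",
                        info.bodyA, info.bodyB, info.impulse);
        }
    }

private:
    float m_threshold;
};

int main() {
    DamageOnImpact listener(10.0f);
    mw::ContactInfo hit = { 1, 2, 42.5f };
    listener.onContact(hit);   // in practice the middleware would drive this callback
    return 0;
}
[/code]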
Quote: Original post by WillC
Quote: Original post by superpig
Consider this: In the past, companies A and B would both have made technology X themselves, each employing a team of five programmers to do so, for a total of ten programmers. Nowadays, they may well buy technology X from a middleware company, which employs the five programmers required to create the tech, plus maybe a couple extra to work on rapid bug response and tech support. An increase in sharing of tech means a decrease in the number of people creating tech, simple as that.


But both companies A and B will still most likely be employing 5 programmers each; they'll just be doing new things that they didn't have time to do before.

Middleware doesn't replace programmers. It just allows you to pack more into your game to meet the players' ever-increasing demand for more shiny new features.


I never said it "replaced programmers" in general, merely that it removed the need for programmers who would previously have been making that tech themselves (cue John Hattan). It's still possible that, as you say, there will be other things for them to do.

Personally, I'm not convinced that there will be in the long run; middleware vendors are working on improving integration with each other, on ease of use, and on extending their tech themselves. Eventually the per-studio programmer responsible for "handling library X" will become less necessary, because the guy responsible for "handling library Y" has enough free time that he could easily do both. Meanwhile, games themselves aren't getting significantly more complex (and there's an upper limit in the form of what players can handle), so there's no real increase in the amount of 'gameplay code' to be written. Granted, people usually find themselves cutting features due to tight schedules, and the continued pressure on studios for faster turnaround times may create jobs for a while yet. But it can only last so long.

Anyway, I've got little to no evidence to back up my opinion so I won't bother trying to argue it [smile]

Richard "Superpig" Fine - saving pigs from untimely fates - Microsoft DirectX MVP 2006/2007/2008/2009
"Shaders are not meant to do everything. Of course you can try to use it for everything, but it's like playing football using cabbage." - MickeyMouse

Quote: Original post by superpig
Personally, I'm not convinced that there will be in the long run; middleware vendors are working on improving integration with each other, on ease of use, and on extending their tech themselves. Eventually the per-studio programmer responsible for "handling library X" will become less necessary, because the guy responsible for "handling library Y" has enough free time that he could easily do both. Meanwhile, games themselves aren't getting significantly more complex (and there's an upper limit in the form of what players can handle), so there's no real increase in the amount of 'gameplay code' to be written. Granted, people usually find themselves cutting features due to tight schedules, and the continued pressure on studios for faster turnaround times may create jobs for a while yet. But it can only last so long.

Anyway, I've got little to no evidence to back up my opinion so I won't bother trying to argue it [smile]


I'm going to disagree with you, just to be awkward [smile]. I'm not going to say you're wrong, because we're speculating about the future and you could well be right, just that from my personal experience I don't think this will happen anytime soon. If you're talking about 15-100 years' time, then sure, it may all be painting-games-by-numbers; but not within the next 5-15 years.

The reason I say this is that I see no trend toward it now. I make console games, and have done ever since the PS1. With each iteration of hardware the teams get bigger, the games get more complex, the game code gets way more complex, and reliance on pre-made libraries and middleware increases. This trend is continuing with next-gen hardware too, and I don't see it ending anytime soon.

Despite what you may read in middleware developers' marketing PDFs, there is NO middleware anywhere that just plugs into a game and goes. It all requires extensive hacking and changing to fit a given game, and that needs programmers to do the work. There's no sign of this changing in the near future. Middleware companies may promise it, but until someone actually delivers something even close to that dream, I'll have to remain sceptical.
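As a concrete illustration of the kind of glue work described above, here's a minimal sketch of routing a middleware library's file I/O through a game's own pack-file system. Both the mw hook table and the PackFile API are invented for this example; real middleware exposes different (and usually much larger) extension points.

[code]
// Hypothetical sketch: adapting an imagined middleware's file hooks to a game's pack files.
#include <cstdio>
#include <cstddef>

// Stand-in for the middleware's extension point.
namespace mw {
    typedef void*       (*OpenFn)(const char* path);
    typedef std::size_t (*ReadFn)(void* handle, void* dst, std::size_t bytes);
    typedef void        (*CloseFn)(void* handle);

    struct FileSystemHooks { OpenFn open; ReadFn read; CloseFn close; };

    static FileSystemHooks g_hooks = { 0, 0, 0 };
    void setFileSystemHooks(const FileSystemHooks& hooks) { g_hooks = hooks; }

    // Stand-in for the middleware's own asset loading path.
    void loadAsset(const char* path) {
        void* handle = g_hooks.open(path);
        char buffer[64];
        std::size_t got = g_hooks.read(handle, buffer, sizeof(buffer));
        g_hooks.close(handle);
        std::printf("middleware read %u bytes from %s\n", (unsigned)got, path);
    }
}

// Stand-in for the game's own pack-file system.
struct PackFile {};

PackFile*   packOpen(const char*)                          { return new PackFile; }
std::size_t packRead(PackFile*, void*, std::size_t bytes)  { return bytes; } // pretend it all loaded
void        packClose(PackFile* file)                      { delete file; }

// Studio-written adapter: exactly the sort of glue that has to be written by hand.
static void*       adapterOpen(const char* path)                  { return packOpen(path); }
static std::size_t adapterRead(void* h, void* dst, std::size_t n) { return packRead(static_cast<PackFile*>(h), dst, n); }
static void        adapterClose(void* h)                          { packClose(static_cast<PackFile*>(h)); }

int main()
{
    mw::FileSystemHooks hooks = { adapterOpen, adapterRead, adapterClose };
    mw::setFileSystemHooks(hooks);   // from here on, the middleware loads through our pack files
    mw::loadAsset("levels/e1m1.pak");
    return 0;
}
[/code]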
Quote: Meanwhile, games themselves aren't getting significantly more complex (and there's an upper limit in the form of what players can handle) so there's no real increase in the amount of 'gameplay code' to be written.
(My italics)
We are far from the upper limit in terms of complexity, so your conclusion is not necessarily true. Think of the shockingly poor AI in most modern games, for instance. Middleware is coming out for that too, but at some point designers will effectively have to be programmers to implement interesting AI and player interaction.
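A rough sketch of what "designers effectively being programmers" could look like for AI: a tiny state machine where the states and transition rules are the sort of thing a designer would author, while a programmer supplies the framework underneath. All names and the whole setup are invented for illustration, not taken from any particular engine.

[code]
// Hypothetical sketch: designer-authored states/transitions over a programmer-built framework.
#include <cstdio>
#include <cstddef>
#include <string>
#include <vector>

struct Perception { float distanceToPlayer; int health; };

// One transition rule: if condition(p) is true, switch to 'next'.
struct Transition {
    bool (*condition)(const Perception& p);
    std::string next;
};

struct State {
    std::string name;
    std::vector<Transition> transitions;
};

// Conditions a designer might combine (a real tool would expose these as data).
static bool playerClose(const Perception& p) { return p.distanceToPlayer < 5.0f; }
static bool playerFar(const Perception& p)   { return p.distanceToPlayer > 15.0f; }
static bool badlyHurt(const Perception& p)   { return p.health < 20; }

class GuardBrain {
public:
    GuardBrain() : m_current("Patrol") {
        // This wiring is the "design" part; in a shipping game it would live in data files.
        State patrol;  patrol.name = "Patrol";
        Transition toAttack = { playerClose, "Attack" };
        patrol.transitions.push_back(toAttack);

        State attack;  attack.name = "Attack";
        Transition toFlee   = { badlyHurt, "Flee" };
        Transition toPatrol = { playerFar, "Patrol" };
        attack.transitions.push_back(toFlee);
        attack.transitions.push_back(toPatrol);

        State flee;    flee.name = "Flee";   // no way back out: a designer's decision

        m_states.push_back(patrol);
        m_states.push_back(attack);
        m_states.push_back(flee);
    }

    void update(const Perception& p) {
        for (std::size_t i = 0; i < m_states.size(); ++i) {
            if (m_states[i].name != m_current) continue;
            for (std::size_t j = 0; j < m_states[i].transitions.size(); ++j) {
                if (m_states[i].transitions[j].condition(p)) {
                    m_current = m_states[i].transitions[j].next;
                    break;
                }
            }
            break;
        }
        std::printf("guard state: %s\n", m_current.c_str());
    }

private:
    std::string m_current;
    std::vector<State> m_states;
};

int main() {
    GuardBrain guard;
    Perception p = { 20.0f, 100 };
    guard.update(p);                             // Patrol
    p.distanceToPlayer = 3.0f;  guard.update(p); // -> Attack
    p.health = 10;              guard.update(p); // -> Flee
    return 0;
}
[/code]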

