
Outsourced audio implementation?

Started by August 31, 2009 01:15 AM
3 comments, last by Jay Taylor 15 years, 2 months ago
Hello. I'm hoping to get a bit of insight into how often audio implementation is outsourced. When a company outsources its audio requirements, is it much more common that they only outsource the audio assets (i.e. sound effects, foley, dialogue, music) and have their internal staff implement them, or is it just as common to also outsource the actual implementation of the assets into the audio engine? Thanks for your help! Jay
It really depends on the size and complexity of the project.

It is more common that they only outsource the actual audio assets. However, with tools like WWise and FMOD Designer, audio designers can also provide the technical implementation. This still requires a high level of communication between the audio provider and the developer, and someone on the audio side who knows what they're doing technically.

Since both tools use event-driven audio hooks, once a hook is integrated into the game the audio provider can change the behavior of the audio if necessary without touching game code. Combine this with parameters passed from the game engine to the audio engine, and complex implementations can be done off-site.
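As a rough sketch of what that event/parameter split looks like, here's a tiny C++ model. All of the names here (the class, `RegisterEvent`, `PostEvent`, `SetParameter`) are invented for illustration; they're not the actual WWise or FMOD API, just the shape of the contract between the two sides:

```cpp
#include <functional>
#include <map>
#include <string>

// Illustrative model of an event-driven audio hook. The game side only
// ever posts named events and sets named parameters; the audio provider
// owns what each event actually does, and can change it off-site.
class AudioEngine {
public:
    // Audio-provider side: register (or re-deliver) behavior for an event.
    void RegisterEvent(const std::string& name,
                       std::function<void(AudioEngine&)> behavior) {
        events_[name] = std::move(behavior);
    }

    // Game-engine side: the only two calls the engineers need to make.
    void PostEvent(const std::string& name) {
        auto it = events_.find(name);
        if (it != events_.end()) it->second(*this);
    }
    void SetParameter(const std::string& name, float value) {
        parameters_[name] = value;
    }

    // Audio behaviors read game-driven parameters (speed, health, etc.).
    float GetParameter(const std::string& name) const {
        auto it = parameters_.find(name);
        return it != parameters_.end() ? it->second : 0.0f;
    }

private:
    std::map<std::string, std::function<void(AudioEngine&)>> events_;
    std::map<std::string, float> parameters_;
};
```

The point of the split: the game code that calls `PostEvent("Event_Footstep")` never changes, while the behavior registered against that name can be revised and re-delivered by the audio provider.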

The audio provider would need access to the game, or could provide a rough implementation that the developer could then fine-tune on site.

I've done this very thing on various projects, delivering the assets as a package with the implementation included. The client simply had to call the right sound hooks and send the right values to the audio engine and, voila, they had interactive sound design.
Game Audio Professional
www.GroovyAudio.com
Thanks! So you might create the hooks in a package, such as event_guard_footsteps_cement, with a randomised container of a few footsteps, pitch randomisation, whatever is needed, send that to the developer, and they can create the events in the game and just link them to those hooks, correct?

Rather than necessarily waiting until they have all the gameplay done, with events, and then bringing in the hooks and tying them together afterwards.
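To make the "randomised container with pitch randomisation" idea concrete, here's a rough C++ sketch of what such a container does internally when triggered. The structure and names are made up for illustration; in practice the designer builds this in the tool's GUI rather than in code:

```cpp
#include <random>
#include <string>
#include <vector>

// What a triggered container hands back to the playback layer.
struct PlayRequest {
    std::string sample;  // which recorded footstep to play
    float pitch;         // playback pitch multiplier, randomized per play
};

// Illustrative random container: a pool of footstep samples plus a
// pitch variance, so repeated footsteps don't sound machine-gunned.
class RandomContainer {
public:
    RandomContainer(std::vector<std::string> samples, float pitchVariance)
        : samples_(std::move(samples)),
          pitchVariance_(pitchVariance),
          rng_(std::random_device{}()) {}

    // Each trigger picks a random sample and a pitch around 1.0.
    PlayRequest Trigger() {
        std::uniform_int_distribution<size_t> pick(0, samples_.size() - 1);
        std::uniform_real_distribution<float> pitch(1.0f - pitchVariance_,
                                                    1.0f + pitchVariance_);
        return {samples_[pick(rng_)], pitch(rng_)};
    }

private:
    std::vector<std::string> samples_;
    float pitchVariance_;
    std::mt19937 rng_;
};
```

The developer's side of the contract stays trivial: they fire the hook, and the variation lives entirely in the container the audio provider delivered.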

Exactly.

The implementation on the developer side would be the tagging of animations.

In tools like WWise you can set up things like switches, so that Event_footstep is just that: play a footstep sound.

You could also provide a switch - dirt, concrete, glass, metal, water - which the engineers could call on surface type change.

Inside your complex containers you'd have a switch container with the correct footstep randomization containers set up.

Boom, instant footstep system. Minimal back and forth between you and the engineers, minimal work for the engineers, and neither group has to depend on the other to get work done.
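A minimal C++ sketch of that footstep system, with invented names (in reality this hierarchy lives in the WWise project, not in game code): one footstep event, one surface switch the engineers set on surface change, and a per-surface sample pool the audio designer set up.

```cpp
#include <map>
#include <random>
#include <string>
#include <vector>

// Illustrative switch-container footstep system. Engineers only call
// SetSwitch() and PostFootstepEvent(); which pool gets used is data the
// audio designer controls.
class FootstepSystem {
public:
    // Audio-designer side: one randomized sample pool per surface type.
    void AddSurface(const std::string& surface,
                    std::vector<std::string> samples) {
        pools_[surface] = std::move(samples);
    }

    // Engineer side: called whenever the surface under the character changes
    // (dirt, concrete, glass, metal, water, ...).
    void SetSwitch(const std::string& surface) { current_ = surface; }

    // Engineer side (or an animation tag): fire one footstep; the current
    // switch decides which pool the random pick comes from.
    std::string PostFootstepEvent() {
        const std::vector<std::string>& pool = pools_.at(current_);
        std::uniform_int_distribution<size_t> pick(0, pool.size() - 1);
        return pool[pick(rng_)];
    }

private:
    std::map<std::string, std::vector<std::string>> pools_;
    std::string current_;
    std::mt19937 rng_{std::random_device{}()};
};
```

Note how the event name never changes with the surface; only the switch state does, which is what keeps the engineering work minimal.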

Game Audio Professional
www.GroovyAudio.com
Makes perfect sense there dude, thanks.

This topic is closed to new replies.
