
How are game animations blended together?

Started by May 08, 2020 01:23 PM
7 comments, last by abhinav789singh 4 years, 6 months ago

How do games blend different animations together? What I mean by blending is this. Suppose we are in an FPS game and we jump and shoot at the same time. How do both animations get played together without affecting each other? Another example: in a third-person game the character responds to inputs almost instantly by playing many animations. In one instance the player is moving to the left at full speed and then tries to change direction by slowing down and rotating its body. How is something like this achieved?


In case you didn't realize it, there is a CPU (possibly even several CPUs) in the system computing the positions of everything for every frame that you see. If things interact, it computes the interaction as well, just like a physics sub-system that computes speeds and collisions.

In other words, it's not a fixed movie, it's computed in real time.


@Alberth I know that; I want to know how different animations are blended together. See, for example, in an FPS game the reload animation is not procedural, yet it blends smoothly with other animations like sliding, jumping, etc.


Typically, each movement animation indexes through one “cycle” of movement. Animations are typically expressed as rotation, and perhaps translation, of “bones” onto which the “body” is rigged. When you move/bend a “bone,” the vertices and textures for the body part bound to that bone move accordingly.

If you half-run, half-walk, both the “walk” and “run” animations are played, cycled through at the same relative speed, so they cycle at the same time.
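To picture how the two cycles are kept in sync, here is a minimal sketch (not from any particular engine; the names and the single-phase approach are just illustrative): one normalized phase is advanced each frame using a cycle length blended from the walk and run cycle lengths, and both clips are then sampled at that same phase.

```cpp
#include <cmath>

// Minimal sketch: keep "walk" and "run" in phase by advancing a single
// normalized phase (0..1 per cycle) with a cycle length blended between
// the two clips. All names here are illustrative, not engine API.
float advancePhase(float phase, float dt,
                   float walkCycleSeconds, float runCycleSeconds,
                   float runWeight)
{
    // Blend the cycle durations so a 50/50 mix cycles at an in-between rate.
    float cycleSeconds = walkCycleSeconds +
                         (runCycleSeconds - walkCycleSeconds) * runWeight;
    phase += dt / cycleSeconds;
    return phase - std::floor(phase); // wrap back into [0, 1)
}
```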

Then, the joint bend values are blended between the two – the “left knee” might bend 22 degrees for the walk cycle, but 38 degrees for the run cycle, so in the “blended” animation case, if the weights are 50/50, the left knee will bend 30 degrees at that point.
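As a rough sketch of what that per-joint blend looks like in code, assuming each sampled pose stores one rotation per bone as a quaternion (the Pose, nlerp, and blendPoses names are made up for illustration, not any engine's API):

```cpp
#include <vector>
#include <cmath>

// Hypothetical minimal pose representation: one rotation per bone,
// stored as a quaternion (w, x, y, z).
struct Quat { float w, x, y, z; };
using Pose = std::vector<Quat>;

// Normalized linear interpolation between two bone rotations.
// nlerp is a common cheap stand-in for slerp when the rotations are close.
Quat nlerp(const Quat& a, const Quat& b, float t)
{
    // Take the shorter arc by flipping the sign if the quaternions oppose.
    float dot  = a.w * b.w + a.x * b.x + a.y * b.y + a.z * b.z;
    float sign = (dot < 0.0f) ? -1.0f : 1.0f;
    Quat q{
        a.w + (sign * b.w - a.w) * t,
        a.x + (sign * b.x - a.x) * t,
        a.y + (sign * b.y - a.y) * t,
        a.z + (sign * b.z - a.z) * t
    };
    float len = std::sqrt(q.w * q.w + q.x * q.x + q.y * q.y + q.z * q.z);
    q.w /= len; q.x /= len; q.y /= len; q.z /= len;
    return q;
}

// Blend two sampled poses (e.g. walk and run at the same normalized
// cycle time) with a single weight: 0 = all walk, 1 = all run.
Pose blendPoses(const Pose& walk, const Pose& run, float runWeight)
{
    Pose out(walk.size());
    for (size_t bone = 0; bone < walk.size(); ++bone)
        out[bone] = nlerp(walk[bone], run[bone], runWeight);
    return out;
}
```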

Once you have the correct pose for each bone, the vertices are put onto the bones using a “skinning” process. This happens for each character, for each frame that's rendered.
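A minimal sketch of that skinning step for one vertex, assuming simple linear blend skinning with up to four bone influences per vertex (the types and names are illustrative; skinMatrices[i] is assumed to already be bone i's current world transform times its inverse bind-pose transform):

```cpp
#include <array>

struct Vec3 { float x, y, z; };

struct Mat4 {
    float m[4][4];
    // Transform a point (w = 1) by this row-major matrix.
    Vec3 transformPoint(const Vec3& p) const {
        return {
            m[0][0]*p.x + m[0][1]*p.y + m[0][2]*p.z + m[0][3],
            m[1][0]*p.x + m[1][1]*p.y + m[1][2]*p.z + m[1][3],
            m[2][0]*p.x + m[2][1]*p.y + m[2][2]*p.z + m[2][3],
        };
    }
};

struct SkinnedVertex {
    Vec3 bindPosition;               // vertex position in the bind pose
    std::array<int, 4> boneIndex;    // bones that influence this vertex
    std::array<float, 4> boneWeight; // influence weights, summing to 1
};

// Weighted sum of the vertex transformed by each influencing bone.
Vec3 skinVertex(const SkinnedVertex& v, const Mat4* skinMatrices)
{
    Vec3 out{0, 0, 0};
    for (int i = 0; i < 4; ++i) {
        Vec3 p = skinMatrices[v.boneIndex[i]].transformPoint(v.bindPosition);
        out.x += v.boneWeight[i] * p.x;
        out.y += v.boneWeight[i] * p.y;
        out.z += v.boneWeight[i] * p.z;
    }
    return out;
}
```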

The cool thing with this is that you can let other things bend the joints between the bones, too. If a weapon would stick into a wall, the game can detect this with collision detection, and bend up the arms so the weapon is moved out of the way, for example.

Certain blend modes may say that particular bones (say, the arms and fingers) are driven by some particular animation (reloading), whereas other bones (say, the legs and torso) are driven by running and turning. Exactly how all of this is blended together depends on the game engine and the particular game; good engines have quite elaborate animation blending setups. See for example Unreal Engine Animation Blending.
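A sketch of that kind of masked layering, reusing the Pose and nlerp definitions from the snippet above (the names and mask convention are made up for illustration; engines expose this under names like blend masks or layered blend per bone):

```cpp
#include <vector>

// boneMask[bone] is 1 where the overlay animation (e.g. reload) should win,
// 0 where the base animation (e.g. run) should win, and fractional near the
// boundary (e.g. the spine) to avoid a visible seam.
Pose layerPoses(const Pose& base, const Pose& overlay,
                const std::vector<float>& boneMask)
{
    Pose out(base.size());
    for (size_t bone = 0; bone < base.size(); ++bone)
        out[bone] = nlerp(base[bone], overlay[bone], boneMask[bone]);
    return out;
}
```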


ok that makes a lot of sense. But I am still confused about how exactly we assign the weights. Do we have to assign weights to all the bones for every animation?


Click_Clock_Boom said:
ok that makes a lot of sense. But I am still confused about how exactly we assign the weights. Do we have to assign weights to all the bones for every animation?

If you have animations for different parts of the body, i.e. one for walking and the other for turning your upper body, typically only a subset of the bones is controlled by each animation. So the bones in the arms are only in the upper-body animation, and the ones in the legs are only in the walk animation. For bones that are shared by both animations, you can blend/lerp between them according to their weights.
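For a shared bone, a simple way to picture it (again just a sketch with made-up names, reusing Quat and nlerp from the snippet above): each layer contributes a rotation plus a per-bone weight, the weights are normalized, and the rotations are interpolated by the normalized factor.

```cpp
// Resolve one bone that two partial animations both claim.
Quat resolveSharedBone(const Quat& fromWalk, float walkWeight,
                       const Quat& fromUpperBody, float upperWeight)
{
    float total = walkWeight + upperWeight;
    if (total <= 0.0f)
        return fromWalk;               // nothing drives this bone; keep the base pose
    float t = upperWeight / total;     // 0 = all walk, 1 = all upper-body
    return nlerp(fromWalk, fromUpperBody, t);
}
```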


Try examining a rigged figure in Poser or DAZ 3D Studio and you will learn a lot about how they are weight-mapped. Figures can be rigged as a single group with complex weight mapping applied, or body joints can be individual groups linked with simpler parametric rigging and a weight-mapped skin. Positional poses can be saved as “frames”, or animations can be saved with the figure (FBX) and even exported separately as BVH keyframes. The rendering engines apply animation blending/synchronization curves as timing constraints on each rotation to make the result look as natural and smooth as desired.


Okay, thank you for explaining it so well.


