
[SIGGRAPH 2018] Mode-Adaptive Neural Networks for Quadruped Motion Control

Started May 13, 2018 04:35 PM
10 comments, last by IADaveMark 6 years, 5 months ago

Animating characters is a pain, right? Especially those four-legged monsters!

This year, we will be presenting our recent research on quadruped animation and character control at the SIGGRAPH 2018 in Vancouver. The system can produce natural animations from real motion data using artificial neural networks. Our system is implemented in the Unity 3D engine and trained with TensorFlow.
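For readers curious what "mode-adaptive" means here: the core idea is a mixture of experts whose weights are blended per frame by a small gating network, so the effective motion network changes with the gait. Below is a minimal NumPy sketch of that idea only; the layer sizes, the 12-dimensional gating feature vector, and the omission of bias terms are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes, for illustration only.
IN, HID, OUT, EXPERTS = 480, 512, 363, 8

def elu(x):
    return np.where(x > 0, x, np.exp(x) - 1)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Gating network: a small MLP mapping a subset of the input
# (e.g. foot end-effector velocities) to blend weights over experts.
Wg1 = rng.standard_normal((32, 12)) * 0.1
Wg2 = rng.standard_normal((EXPERTS, 32)) * 0.1

# Expert weights for the motion network: one weight set per expert.
W1 = rng.standard_normal((EXPERTS, HID, IN)) * 0.01
W2 = rng.standard_normal((EXPERTS, OUT, HID)) * 0.01

def predict(x, gating_features):
    # 1. The gating network decides how to blend the experts.
    alpha = softmax(Wg2 @ elu(Wg1 @ gating_features))
    # 2. Blend the expert weights into one network for this frame.
    W1b = np.tensordot(alpha, W1, axes=1)
    W2b = np.tensordot(alpha, W2, axes=1)
    # 3. Run the blended motion network on the full input.
    return W2b @ elu(W1b @ x)

y = predict(rng.standard_normal(IN), rng.standard_normal(12))
print(y.shape)  # (363,)
```

The key point is step 2: the blend happens in weight space, so for any frame the character is driven by a single ordinary feedforward network whose parameters adapt to the current motion mode.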

If you are curious about such things, have a look at this:

[Video: Mode-Adaptive Neural Networks for Quadruped Motion Control]

Awesome! Is the 3ms you show the runtime the ANN needs to generate motion from an input, as shown on the buttons at the top of the screen? Do you think it can become faster?

Personally I think the result is worth the time, but game designers should start thinking about how this tech can create new gaming experiences while working within its limits on how many characters it can drive at once.


No, that's m/s (meters per second): a user control that specifies how fast the character should move. The locomotion gait then follows automatically from it ;)
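One common way to feed such a speed control into an animation system, sketched here as a hypothetical example rather than the paper's actual control scheme, is to exponentially blend the character's current speed toward the commanded one, so gait changes look continuous instead of snapping:

```python
import math

# Hypothetical: smooth a user-specified target speed (m/s) into the
# character's current speed. `responsiveness` is an assumed tuning knob.
def update_speed(current, target, dt, responsiveness=2.0):
    # Frame-rate-independent exponential smoothing toward the command.
    blend = 1.0 - math.exp(-responsiveness * dt)
    return current + (target - current) * blend

speed = 0.0
for _ in range(60):              # one second at 60 fps
    speed = update_speed(speed, 3.0, 1.0 / 60.0)
print(round(speed, 2))           # → 2.59
```

After one second the speed has covered 1 - e^(-2) ≈ 86% of the gap to the 3 m/s command; the network then sees a smoothly varying speed input rather than a step.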

Ouch, haha :)

But some performance numbers would be very interesting for game devs! (It's not mentioned in the paper either.)

I work on physical simulation of bipeds. There, balancing is such a tight constraint that natural locomotion emerges automatically, and no machine learning seems necessary for the basics. But for more complex motion beyond walking/running, I think both approaches need to be combined.

Yeah, those guys at UBC do some nice work - just look for DeepMimic, which will be presented at the same venue as our quadruped research. As for performance: the network costs about 2ms per frame on a gaming laptop. It currently runs on the CPU with the Eigen library; running it on the GPU gives a significant speedup, of course. From what I have heard, NVIDIA made a demo running the PFNN on ~100 human characters in parallel - so I think it's already applicable for practical use :)
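For anyone who wants a feel for where that per-frame cost comes from, here is a rough way to benchmark a dense network of broadly comparable size. This is a NumPy sketch with assumed layer sizes, not the deployed Eigen/C++ code, and the number it prints will vary by machine:

```python
import time
import numpy as np

rng = np.random.default_rng(0)

# Assumed layer shapes (out, in) for a 3-layer dense network;
# the real network's sizes may differ.
layers = [(512, 480), (512, 512), (363, 512)]
weights = [rng.standard_normal(s).astype(np.float32) * 0.01 for s in layers]
x0 = rng.standard_normal(480).astype(np.float32)

def forward(x):
    # Plain feedforward pass: matrix-vector products plus a
    # cheap nonlinearity between layers.
    for i, W in enumerate(weights):
        x = W @ x
        if i < len(weights) - 1:
            x = np.maximum(x, 0.0)   # ReLU stand-in for the activation
    return x

forward(x0)                          # warm up caches / BLAS
n = 1000
t0 = time.perf_counter()
for _ in range(n):
    forward(x0)
ms = (time.perf_counter() - t0) / n * 1e3
print(f"{ms:.3f} ms per frame")
```

The cost is dominated by the matrix-vector products, which is why a tuned CPU linear-algebra library like Eigen (or moving to the GPU for many characters at once) matters so much here.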

It's really amazing to see how animation tools are evolving, can't wait to see what games can be made from this.

So what is the expected and hoped-for path for this awesome software after SIGGRAPH? Are you planning to make your own software with this, or is your aim for it to be used directly in game engines?

Looks brilliant.


I'm just working on my PhD, that's all ;) However, I think there is a good chance people will want to integrate it, since it's developed directly inside Unity. We will continue researching other topics relevant to character animation using AI techniques - so let's see... :)

Hi Sebastian, this is very interesting! Is there an electronic version of the paper that we can read? I would like to learn more. Thanks!

Yeah, the paper is available at https://github.com/sebastianstarke/AI4Animation

That's awesome! Thank you very much!

