Animation Test for Avatar Motion


#1

Here’s a simple test of the avatar motion. It includes idle, turn, and jump. The animations were downloaded from Mixamo and edited in Maya. Ultimately, we’d like to abstract the motion and integrate it into the Interface navigation controls. We’re thinking of creating the concept of a runtime rig, which would be comparable to an IK rig.
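
To make the navigation idea a bit more concrete, here’s a rough sketch of how input might select between the three test clips. All of the names here are placeholders for illustration, not actual Interface code:

```cpp
// Hypothetical sketch: mapping navigation input to the three test clips.
// None of these names are real Interface API; they are placeholders.
#include <cmath>
#include <iostream>
#include <string>

enum class MotionState { Idle, Turn, Jump };

// Choose an animation state from simple navigation inputs.
MotionState selectState(float turnSpeed, bool jumpPressed) {
    if (jumpPressed) return MotionState::Jump;
    if (std::fabs(turnSpeed) > 0.1f) return MotionState::Turn;
    return MotionState::Idle;
}

std::string clipFor(MotionState state) {
    switch (state) {
        case MotionState::Turn: return "mixamo_turn";
        case MotionState::Jump: return "mixamo_jump";
        default:                return "mixamo_idle";
    }
}

int main() {
    // Example: the user is turning, not jumping.
    MotionState state = selectState(0.5f, false);
    std::cout << "play clip: " << clipFor(state) << "\n";
}
```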

Enjoy!

-Ozan


#2

Could you elaborate a bit more on what this means? I don’t wish to assume or guess. My question is: would that lead one not to spend time refining weighting to fix some of the more noticeable (but fixable) AV deformation issues with the current skeleton and weighting? I don’t want to burn a lot of hours (or see anyone else do so) on something that may ultimately have zero use.


#3

I second the statements made by OmegaHeron. Please let us know. Thanks.


#4

Good question. Fear not. The proposed runtime rig would be fully compatible with the Mixamo rig. The purpose of the runtime rig is to give our engineers a simple way to coordinate animation cycles with procedural actions. It’s an abstraction layer that works under the hood at the C++ layer. Nothing about it affects us as we construct our assets and do our skin weighting. In fact, the test above was done with the standard Mixamo rig.
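
To illustrate what “coordinating animation cycles with procedural actions” could look like, here’s a rough sketch, not the actual engine code and with made-up names throughout: a procedural look-at adjusts the head after the idle cycle is sampled, without touching the asset or its weighting.

```cpp
// Rough sketch of a runtime-rig style overlay: procedural code adjusts a
// joint after the animation cycle is sampled. Names are illustrative only.
#include <cmath>
#include <iostream>

struct JointPose { float pitch = 0.f; float yaw = 0.f; };

// Sample a looping idle cycle for the head joint (stand-in for real clip data).
JointPose sampleIdleHead(float timeSec) {
    JointPose p;
    p.pitch = 2.f * std::sin(timeSec);  // gentle idle sway, in degrees
    return p;
}

// Procedural action: aim the head toward a target yaw, blended over the
// animated pose, so the asset and its weighting stay untouched.
JointPose applyLookAt(JointPose animated, float targetYawDeg, float blend) {
    animated.yaw = animated.yaw * (1.f - blend) + targetYawDeg * blend;
    return animated;
}

int main() {
    JointPose head = sampleIdleHead(1.25f);
    head = applyLookAt(head, 30.f, 0.75f);  // look 30 degrees aside, 75% strength
    std::cout << "head pitch " << head.pitch << ", yaw " << head.yaw << "\n";
}
```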


#5

Just curious - are you thinking of blending influences on the fly over the existing animations, kind of like a game engine? That is, blending in waves, kisses, and other gestures without necessarily interrupting the facial and gesture capture, movement, and other animations?

Was just thinking it could be fun to have some simple emoticon tools for accessing quick gestures, etc.
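
Something like this, conceptually - a gesture layer masked to the arm joints so the base motion keeps playing everywhere else (made-up names, just to illustrate the question):

```cpp
// Sketch of layered gesture blending: a wave plays on the arm joints while
// the base walk keeps driving everything else. Names are illustrative only.
#include <array>
#include <iostream>
#include <string>

constexpr int kNumJoints = 4;
const std::array<std::string, kNumJoints> kJointNames = {
    "hips", "spine", "rightArm", "rightHand"};

using Pose = std::array<float, kNumJoints>;  // one angle per joint, in degrees

// Per-joint mask: 1 = gesture owns the joint, 0 = base animation owns it.
Pose blendLayered(const Pose& base, const Pose& gesture, const Pose& mask) {
    Pose out;
    for (int i = 0; i < kNumJoints; ++i)
        out[i] = base[i] * (1.f - mask[i]) + gesture[i] * mask[i];
    return out;
}

int main() {
    Pose walk    = {10.f, 5.f, -20.f, 0.f};
    Pose wave    = {0.f, 0.f, 80.f, 45.f};
    Pose armOnly = {0.f, 0.f, 1.f, 1.f};  // gesture only touches the arm joints
    Pose result  = blendLayered(walk, wave, armOnly);
    for (int i = 0; i < kNumJoints; ++i)
        std::cout << kJointNames[i] << ": " << result[i] << "\n";
}
```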


#6

Thank you @ozan - most helpful and encouraging. Now back to fixing the neck when looking up at maximal head tilts - that one makes me cringe as much as the SL/OS AV folding under the chest. : )


#7

Agreed. Yes, we’ll want to blend animations from several inputs, including keyframe, mocap, and the physics engine. Good idea.
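
As a rough sketch of the idea - weighted per-joint blending of several motion sources, with made-up weights and names. (This uses a naive per-angle lerp for brevity; a real implementation would blend quaternions.)

```cpp
// Sketch: combine keyframe, mocap, and physics contributions for one joint
// using normalized weights. Values and names are illustrative only.
#include <iostream>

struct JointRotation { float pitch, yaw, roll; };

// Blend three motion sources with normalized weights (naive per-angle lerp;
// a real engine would blend quaternions instead of raw Euler angles).
JointRotation blendSources(JointRotation keyframe, JointRotation mocap,
                           JointRotation physics,
                           float wKey, float wMocap, float wPhys) {
    const float total = wKey + wMocap + wPhys;
    wKey /= total; wMocap /= total; wPhys /= total;  // normalize the weights
    return {
        keyframe.pitch * wKey + mocap.pitch * wMocap + physics.pitch * wPhys,
        keyframe.yaw   * wKey + mocap.yaw   * wMocap + physics.yaw   * wPhys,
        keyframe.roll  * wKey + mocap.roll  * wMocap + physics.roll  * wPhys,
    };
}

int main() {
    JointRotation kf = {10.f, 0.f, 0.f};  // authored keyframe clip
    JointRotation mc = {12.f, 5.f, 0.f};  // live mocap input
    JointRotation ph = {8.f, 0.f, 2.f};   // physics contribution (e.g., a ragdoll pull)
    JointRotation out = blendSources(kf, mc, ph, 0.5f, 0.3f, 0.2f);
    std::cout << out.pitch << " " << out.yaw << " " << out.roll << "\n";
}
```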

-Ozan