Hacking the default Avatar


I am looking at making a simple avatar with no articulated parts, but I want a speech indicator that reacts to microphone level in a similar way to how the default avatar’s mouth does.

Can someone point me to the resources responsible for this behaviour?

Also, is the default avatar’s mouth a texture swap or a mesh morph? (A texture change suits me better, though if what I want only works with a morph, I can likely work with that.)
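For what it’s worth, the usual shape of a mic-driven speech indicator is: normalise the input loudness, then smooth it so the indicator rises quickly and decays slowly. A minimal sketch of that idea (the function names, curve, and constants are my own illustration, not the engine’s API or the default avatar’s actual values):

```javascript
// Sketch: map raw microphone loudness to a 0..1 "mouth open" value.
// loudnessToOpenness's curve and smooth's rates are illustrative guesses.
function loudnessToOpenness(loudness, maxLoudness) {
  // Clamp and normalise so silence -> 0 and max input -> 1.
  var v = Math.min(Math.max(loudness / maxLoudness, 0), 1);
  return Math.sqrt(v); // boost quiet speech so the indicator visibly moves
}

function smooth(previous, target, attack, decay) {
  // Rise quickly when speech starts, fall slowly when it stops.
  var rate = target > previous ? attack : decay;
  return previous + (target - previous) * rate;
}

// Example per-frame update:
var mouthOpen = 0;
mouthOpen = smooth(mouthOpen, loudnessToOpenness(0.5, 1.0), 0.8, 0.2);
```

Whatever the engine exposes for mic level would feed the first function; the smoothed value then drives the texture or morph.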



This may be an unfair thing to point you to, but as far as I know it’s how what you want is done… there seems to be no simple faux-mouth-movement option; it’s all or nothing, if I’ve got it right.



Thanks @OmegaHeron.

The avatar I am envisioning has a single large eye, and I was thinking of changing the texture of the iris as a speech indicator, but dilating/contracting the iris via the JawOpen key might work (might be even better!).
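The iris idea could be a simple remap of the JawOpen coefficient (0 = closed, 1 = fully open) onto an iris scale. A sketch, with the parameter names and scale range being placeholders rather than anything from the engine:

```javascript
// Sketch: map a JawOpen blendshape coefficient (0..1) to an iris scale.
// restScale is the idle size; minScale is the fully contracted size.
function jawOpenToIrisScale(jawOpen, restScale, minScale) {
  // Contract the iris linearly as the "jaw" opens, so louder
  // speech reads as a tighter pupil.
  var t = Math.min(Math.max(jawOpen, 0), 1);
  return restScale + (minScale - restScale) * t;
}
```

Swapping `minScale` above `restScale` would give dilation instead of contraction.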


Cool! If it doesn’t complain about only one shape being defined you might have it made.


I guess if it does, I can put in a bunch of null morph targets that exist but don’t change anything.

The line “If you want to create a 1:1 mapping of shapes to Faceshift, you will need to create shapes for …” indicates this probably won’t be necessary.



Cool idea.

Note that you do not need to implement all the blendshapes for the face. For your particular avatar, you could implement only the jawOpen blendshape and let the audio trigger animate it. You can ignore all the other shapes.
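Conceptually, the tracker or audio trigger produces coefficients for many shapes, and only the ones your mesh actually defines take effect. A toy sketch of that filtering (structure and names invented for illustration; the engine does this internally):

```javascript
// Sketch: apply only the blendshapes the mesh defines, ignoring the rest.
// `coefficients` is what the tracker/audio trigger produces;
// `meshShapes` is whatever shapes the avatar actually implements.
function applyBlendshapes(coefficients, meshShapes) {
  var applied = {};
  for (var name in coefficients) {
    if (meshShapes.indexOf(name) !== -1) {
      applied[name] = coefficients[name];
    }
  }
  return applied;
}

// With only JawOpen implemented, the other coefficients are no-ops:
var result = applyBlendshapes(
  { JawOpen: 0.7, EyeBlink_L: 0.2, BrowsU_C: 0.1 },
  ["JawOpen"]
);
```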


Okay! I am making good progress reverse-engineering the default avatar meshes (everything is now in place, though the bones are likely all messed up if they ever need to move!). I decided to just use the glow-on-speak functionality as a speech indicator, so the single eyeball is the only thing that needs to move anyway (attached to the left-eye bone).

One thing I am having difficulty with: my name plaque has disappeared. Does that get attached to a particular bone, or is my avatar messing something up?