OK, am back on this. RL distractions cleared, I have time on my side again - back to the fun stuff!
Standard HiFi rig
I’ve been having a good play around with Rigify and the advanced Pitchipoy rig. The Pitchipoy rig looks really good - lots of spine bones, roll bones throughout, etc. I don’t think the unusual Rigify quirk of parenting the thumb joint to the IndexFinger will be a problem, but it will need testing. I got a standard Rigify rig up and running yesterday, and intend to tackle skinning a Pitchipoy one to see if I can animate it in HiFi. If so, I’ll put it up publicly somewhere so it can be tested (particularly with animation-controlling input devices - Faceshift, datagloves, Hydras).
If anyone has any info on expected joint names for input devices, please let me know asap and I’ll make sure any required naming conventions are followed.
Once thoroughly tested, tweaked and trusted, I guess we’ll be able to accept the Pitchipoy rig as the standard rig for HiFi - job done!
However, we’ll need to fully test importing animation data before that happens, and that requires being able to access (via the JS API) the Hips translation data from the FBX, which, unless I’ve missed something, isn’t currently possible…
Procedural animation and DSP
Procedural animation is new tech to me, and I’m very much interested in getting something up and running. I did some experiments the other night towards converting existing mocap data to procedural animation data (in this case, a short series of sine waves and associated phase values). Obviously, much of the nuance in the mocap data would be lost, but the trade-off in file size and blending capabilities makes it very much worth further investigation, particularly for walking, idle and sitting type animations. I don’t think the technique will prove suitable for longer, more complex sequences like dances.
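To make the idea concrete, here’s a minimal, purely illustrative sketch in plain JavaScript of what the converted data might look like and how it would be re-synthesised per frame - the component values and names are made up by me, not taken from real mocap or from Interface’s API:

```javascript
// Purely illustrative: approximate one animation channel (say, Hips sway on X)
// as a small set of sine components, then re-synthesise it at any time t.
// All values and names here are hypothetical.
var hipsSwayComponents = [
    { frequency: 0.9, amplitude: 0.030, phase: 0.0 },  // Hz, channel units, radians
    { frequency: 1.8, amplitude: 0.008, phase: 1.2 }   // smaller double-frequency wobble
];

// Rebuild the channel value at time t (seconds) from the stored components.
function evaluateChannel(components, t) {
    var value = 0;
    for (var i = 0; i < components.length; i++) {
        var c = components[i];
        value += c.amplitude * Math.sin(2 * Math.PI * c.frequency * t + c.phase);
    }
    return value;
}

// Sample the synthesised sway at 30 fps for one second.
for (var frame = 0; frame < 30; frame++) {
    var t = frame / 30;
    console.log("t=" + t.toFixed(2) + " sway=" + evaluateChannel(hipsSwayComponents, t).toFixed(4));
}
```

A handful of numbers per channel like that is tiny compared with baked keyframes, and blending two gaits becomes a matter of interpolating amplitudes and frequencies - which is where the file-size and blending wins mentioned above come from.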
I found a very easy-to-use DSP (digital signal processing) JavaScript library the other night, and got it to extract some fundamental frequencies from some random animation data using an FFT. It all works nicely and could be used as a basis for uploading and converting animation files using JS in Interface.
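For anyone curious about the general approach (this isn’t the library’s actual API, which I won’t reproduce here), here’s a self-contained sketch that does the same job with a naive DFT instead of an FFT - slower, but it shows how a dominant frequency falls out of a sampled channel:

```javascript
// Illustrative only: find the dominant frequency in a sampled animation channel.
// A naive DFT stands in for the library's FFT; the test signal is fabricated.
var sampleRate = 30;   // samples per second (e.g. 30 fps mocap)
var samples = [];
for (var n = 0; n < 120; n++) {   // 4 seconds of a fake 1.5 Hz sway
    samples.push(0.03 * Math.sin(2 * Math.PI * 1.5 * n / sampleRate));
}

// Magnitude of DFT bin k for a real-valued signal.
function dftMagnitude(signal, k) {
    var re = 0, im = 0;
    for (var n = 0; n < signal.length; n++) {
        var angle = -2 * Math.PI * k * n / signal.length;
        re += signal[n] * Math.cos(angle);
        im += signal[n] * Math.sin(angle);
    }
    return Math.sqrt(re * re + im * im);
}

// Scan the positive-frequency bins (skipping bin 0, the DC offset) for the peak.
var bestBin = 1;
for (var k = 2; k < samples.length / 2; k++) {
    if (dftMagnitude(samples, k) > dftMagnitude(samples, bestBin)) {
        bestBin = k;
    }
}
var fundamentalHz = bestBin * sampleRate / samples.length;
console.log("Dominant frequency ~ " + fundamentalHz.toFixed(2) + " Hz");  // ~1.50 Hz
```

The phase of each component can be recovered from the same bin’s real and imaginary parts (via atan2), which together with amplitude and frequency is everything the sine-wave representation above needs.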
But again, a fully working prototype isn’t possible right now, because there is no way of accessing the FBX translation data in the JS API - it’s very hard to walk without swaying one’s hips!
Missing animation translation data in JS
Basically, everything I’m trying to do is hampered by the inaccessibility of Hips FBX translation data, as described in posts above and demonstrated here: http://www.youtube.com/watch?v=iiQfqBNp0oM
As a result, I’m going to have a look at fixing it myself. I’ve had a good look through the source code for Interface and I think I can do it. I will put it up as a worklist suggestion and bid on it if it still looks like it’s within my capabilities after I’ve dug into the detail a bit further.
Quick shout to @Andrzej - I see you wrote most of the existing animation code in Interface. If there’s any reason why access to Hips translation data in the JS API hasn’t been implemented, could you please let me know asap so I don’t waste any time on it!
Ragdoll
@Judas - you mentioned ragdoll animation as a new feature in another post. I enabled the
Developer -> Avatar Options -> Collide As Ragdoll
setting briefly last night, but couldn’t see any difference when I collided with voxels. Could you point me in the right direction to see where we’re at with that? I’m particularly interested in how it blends with / replaces other sources of animation…
Cheers!