Problem: Create a procedural walk animation for HiFi avatars that lets them correctly plant their feet without slipping as they move over the voxels beneath them.
Research towards solution:
Perhaps we could utilize some of David Rosen’s hybrid procedural and physics-based animation from Overgrowth, which he outlined in this interview about his animating/coding (he is the sole animator and coder on Wolfire Games’ four-man team). He achieves great-looking state transitions, driven by linear and angular velocity, with very low overhead and almost no hand-authored animation!
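Not HiFi code, just a toy Python sketch of the Rosen-style idea: blend between a couple of extreme keyframe poses, with the walk-cycle phase driven by distance traveled rather than by time (all names and numbers below are made up for illustration). Tying phase to distance is what keeps foot timing locked to the ground, so feet don’t slide when speed changes:

```python
def stride_phase(distance_traveled, stride_length):
    """Phase in [0, 1) of the walk cycle, driven purely by distance
    covered, so foot contacts stay locked to the ground as speed varies."""
    return (distance_traveled / stride_length) % 1.0

def blend(pose_a, pose_b, t):
    """Linear interpolation between two keyframe poses (joint angles)."""
    return {j: (1 - t) * pose_a[j] + t * pose_b[j] for j in pose_a}

# Two hand-made extreme poses, Overgrowth-style (angles in radians;
# values are placeholders, not taken from any real rig).
PASS_POSE  = {"hip": 0.00, "knee": 0.10}   # legs passing each other
REACH_POSE = {"hip": 0.45, "knee": 0.60}   # front foot reaching out

def walk_pose(distance_traveled, stride_length=1.2):
    phase = stride_phase(distance_traveled, stride_length)
    # Triangle wave: reach -> pass -> reach over one stride.
    t = 1.0 - abs(2.0 * phase - 1.0)
    return blend(REACH_POSE, PASS_POSE, t)
```

Two poses obviously aren’t enough for a real gait, but the point is how little authored data the technique needs.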
Utilize some of Rune Skovbo Johansen’s concepts from his 2009 Master’s thesis, Automated Semi-Procedural Animation for Character Locomotion (Department of Information and Media Studies, Aarhus University), and perhaps his locomotion system for automated motion analysis in Maya (seen here, here, and on his thesis page ad nauseam). However, animating this way may be entirely unnecessary and outdated by now. IDK yet.
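The part of Johansen’s approach most relevant to our foot-slipping problem is the final IK adjustment that pins a planted foot to the sampled ground point. A minimal 2D two-bone IK solver (law of cosines) gives the flavor; this is my own sketch, not code from the thesis, and the leg dimensions are arbitrary:

```python
import math

def two_bone_ik(hip, target, thigh_len, shin_len):
    """Minimal 2D two-bone IK: returns (hip_angle, knee_bend) in radians
    so the ankle reaches `target`. hip_angle is measured from the +x axis;
    knee_bend is 0 for a straight leg."""
    dx, dy = target[0] - hip[0], target[1] - hip[1]
    # Clamp to reachable range so acos stays in domain.
    dist = min(math.hypot(dx, dy), thigh_len + shin_len - 1e-6)
    # Knee bend from the law of cosines.
    cos_knee = (thigh_len**2 + shin_len**2 - dist**2) / (2 * thigh_len * shin_len)
    knee = math.pi - math.acos(max(-1.0, min(1.0, cos_knee)))
    # Hip angle: direction to target plus the inner triangle angle.
    cos_inner = (thigh_len**2 + dist**2 - shin_len**2) / (2 * thigh_len * dist)
    hip_angle = math.atan2(dy, dx) + math.acos(max(-1.0, min(1.0, cos_inner)))
    return hip_angle, knee
```

In a full system you’d feed the target from a ground raycast under the foot during stance, which is exactly where the voxels come in.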
Thinking way outside the box: I remember a paper from the IEEE 2010 International Conference on Cyberworlds by Zhiqiang Luo et al. called Building Hand Motion-Based Character Animation: The Case of Puppetry [see jpegs of pdf], where they described a means by which motion-capture data and procedural animation are triggered by an HCI device (a Smart Glove in their case) to “…either activate the designed procedural animation through motion recognition or tune the parameters of the procedural animation to build the new motion, which allows the direct user control on the animation,” using their IMHAP (Intelligent Media Lab’s Humanoid Animation Platform). The system is structured on the model-view-controller (MVC) design pattern: the model encapsulates the animation data, the view handles interaction with the user, and the controller is the communication bridge between the two. [fig.1]
Of course the control in this case could be gesture-based via markerless motion capture or webcam interpolation today, and the puppet…well, you get the idea.
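To make the IMHAP/MVC split concrete, here’s a skeletal sketch of how gesture input could either trigger a canned procedural animation or tweak its parameters. Everything here (class names, the gesture table) is invented by me for illustration, not the paper’s actual API:

```python
class AnimationModel:
    """Model: encapsulates the procedural-animation parameters."""
    def __init__(self):
        self.params = {"stride_length": 1.2, "speed": 0.0}

    def set(self, name, value):
        self.params[name] = value

class AnimationView:
    """View: in the paper this would render the avatar; here it
    just exposes the current parameters for display."""
    def render(self, model):
        return dict(model.params)

class GestureController:
    """Controller: maps a recognized gesture (glove, webcam, ...)
    either to a procedural animation trigger or to a parameter tweak."""
    GESTURE_TABLE = {"walk": ("speed", 1.0), "stop": ("speed", 0.0)}

    def __init__(self, model):
        self.model = model

    def on_gesture(self, gesture):
        if gesture in self.GESTURE_TABLE:
            name, value = self.GESTURE_TABLE[gesture]
            self.model.set(name, value)
```

The nice property for us is that the input device is fully decoupled: swapping a Smart Glove for webcam tracking only changes what feeds `on_gesture`.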
Here are a few pages of this wacky conference paper; message me and I can send the rest if you’re interested:
I’m just brainstorming right now. Maybe a combo of all three, depending on the runtime and physics handling; maybe none of it. IDK.
Go team HiFi!