Meetup - 19th Feb @ 2pm PST


#1

Hi All,

Meetup today will be at hifi://sandbox/winter. Agenda:

  • Particles
  • Leap
  • Vive Pre
  • Bugs
  • General Banter

I will stream the feed.

Chris


#2

Please, please, please can someone stream this on Twitch? Or YouTube? Thanks.


#3

Can we discuss essential basic things we don’t have yet but really need, so people will like us?
e.g.
I want to bloody sit down


#4

Yes, any word on when assemblies will happen? ref: https://alphas.highfidelity.io/t/couple-of-bugs-related-to-physics-and-camera/10227/4


#5

I hit my four-crash limit in the meetup, so writing down what I tried to say:

A good project to help newcomers get started would be a product, which I will call “MakeAvatar”. It could be partly based on MakeHuman. It is something a person would grab from the marketplace and rez/instantiate in-world. That would create an avatar on the platform, with a set of controls to adjust the avatar’s look. Then, when the person is happy, they could have it send them the metrics or update their avatar right there.

The point of this product is to make it dead easy for most people to set up their avatar. Custom designers would likely not use this product since they have their own tools and workflows to make their specialty avatars.

In any case, we also need to use the standard humanoid skeleton @Menithal created as much as possible. This permits animations to have maximal interoperability, something needed for general virtual worlds. Again, custom designers would do their own thing and use their own tools, albeit at a much higher cost.

Another tool would be a converter from .BVH or .anim files to the standard skeleton. This could be a script, or part of some animation-applier product; again, the point is to create a low entry barrier for people who have existing animations. In time, finer animations that understand and power all the extra bones will come to pass, but the point is to get content here on a faster trajectory.
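The core of such a converter is just a joint-name remap. Here is a minimal sketch in JavaScript (the HF scripting language); the source joint names, the standard-skeleton names, and the animation data shape are all assumptions for illustration, since the real names would come from @Menithal's skeleton spec and the BVH parser in use:

```javascript
// Hypothetical map from common BVH rig joint names to the standard
// humanoid skeleton names. The real table would follow the spec.
const JOINT_MAP = {
  "hip": "Hips",
  "abdomen": "Spine",
  "chest": "Spine2",
  "neck": "Neck",
  "head": "Head",
  "lShldr": "LeftArm",
  "rShldr": "RightArm"
  // ...extend per source rig
};

// Remap joint names on a parsed animation. `animation` is assumed to
// look like { channels: [{ joint, frames }] }. Channels whose joint has
// no mapping are dropped rather than passed through with a bad name.
function remapToStandardSkeleton(animation) {
  return {
    channels: animation.channels
      .map(function (ch) {
        const target = JOINT_MAP[ch.joint];
        return target ? { joint: target, frames: ch.frames } : null;
      })
      .filter(Boolean)
  };
}
```

The same table-driven approach would work for .anim files; only the parser in front of it changes.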


#6

no sits no avatars, hmm where do we apply for Sansar, maybe they have an interest in the user experience


#7

and I was doing so, albeit upside down of course lol


#8

Forgot to ask, are the physics issues going to get addressed? (Velocities resetting, sometimes never applying, etc)


#9

Sansar = NDA, so YMMV.

Let’s talk sitting and what the process entails. First, from the highest abstraction:

  1. You see a chair or a ledge on a nice water fountain.
  2. You walk right up to it, then bend at the knees and sit down.

Now let’s see how legacy grids or legacy virtual worlds and games do it:

  1. You see a virtual chair or virtual water fountain
  2. You hover your mouse pointer over the ‘object’, a little sit icon appears.
  3. You click on object and you are instantly snagged and positioned where the object’s preferred sit position is set.
  4a. A script applies a sitting animation onto your avatar, or,
  4b. The simulator applies a default sitting animation onto your avatar.

One possible High Fidelity approach:

  1. You see a virtual chair or virtual water fountain
  2. You walk right up to the chair or ledge, bump into it
  3. A script in the object runs a turn-around and sit-down animation. It positions your avatar to its preferred sit position based on size and girth metrics.

So far this is an object doing something to your avatar. It could be turned around such that you have a device that understands a panoply of different kinds of avatar behaviors (or this is something that is an interface default, or something domain owners set up as a default). So:

  1. You see a virtual chair or virtual water fountain
  2. You walk right up to the chair or ledge, bump into it
  3. Your avatar behavior script (ABS) gets the collision event. It queries the object to see if it permits sitting, and if so where is a good spot. It also queries the object for a preferred animation to run. If none, it selects the ABS’s sit animation.
  4. The ABS runs a turn-around and sit-down animation. It positions your avatar at the preferred sit position obtained from the object. Lacking that data it does a best guess and sits you somewhere, or maybe it runs a back-away-from-the-hostile-object animation.

I am thinking the last approach is best because it lets you choose the behaviors and animations you wish and it maximizes where you can sit.
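The ABS negotiation in steps 3–4 could be sketched roughly as below. This is a mock, not the real HF scripting API: the field names (`permitsSitting`, `preferredSitPosition`, `preferredSitAnimation`) stand in for whatever query protocol the object and the ABS would actually agree on:

```javascript
// Default animation the ABS falls back on when the object offers none.
const ABS_DEFAULT_SIT_ANIMATION = "sit_generic";

// Called from the ABS's collision handler. `object` is whatever the
// collision event handed us; we query it for sit support, a seat
// position, and a preferred animation, falling back at each step.
function handleCollision(avatar, object) {
  if (!object.permitsSitting) {
    // Unsittable (or "hostile") object: back away instead of sitting.
    return { action: "back-away", animation: "back_away" };
  }
  return {
    action: "sit",
    // Best guess (stay where we are) when the object gives no spot.
    position: object.preferredSitPosition || avatar.position,
    animation: object.preferredSitAnimation || ABS_DEFAULT_SIT_ANIMATION
  };
}
```

The appeal of this shape is exactly the point made above: the object only advertises data, while the avatar's own script chooses the behavior and animations, so any object with a seat position becomes sittable.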


#10

Video from the meetup. The video drops out at the end.


#11

My thoughts on sits

Ideally I want to take a chair I made, sit my avatar on it in Blender, and create the animation.
This part I can do.
Then, at the most basic level, I want to drop that animation into the chair in the HF properties panel, check the “play animation on touch” box, and hey presto, my avatar sits on the chair.
Going forward, of course, people will want to make elaborate multi-animation, multi-person objects; these will probably live in some menacingly complex code to be created at a later date.

Once we have this basic functionality we can use it to make doors and pretty much everything come alive on contact with it,
which is a shitload less boring than what we have now.
The dance floor at Music seems way more complex a thing than it should be. It should be a thing that makes you dance when you walk on it and stops you dancing when you walk off it.
Also we need animation priority levels,
so if you’re dancing or sitting and you have hand controllers, they take over the required limbs.
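The priority idea could be as simple as: each animation layer claims a set of limbs at a priority, and the highest-priority claim per limb wins. A minimal sketch, with made-up layer names, limb names, and priority values (nothing here is a real HF API):

```javascript
// Resolve which animation drives each limb: for every layer, the
// highest-priority claim on a limb wins; lower layers keep the rest.
function resolveLimbAnimations(layers) {
  const result = {};
  for (const layer of layers) {
    for (const limb of layer.limbs) {
      if (!result[limb] || layer.priority > result[limb].priority) {
        result[limb] = { animation: layer.animation, priority: layer.priority };
      }
    }
  }
  return result;
}

// A full-body dance, with live hand controllers overriding the arms:
const layers = [
  { animation: "dance", priority: 1, limbs: ["leftArm", "rightArm", "legs", "torso"] },
  { animation: "controller", priority: 10, limbs: ["leftArm", "rightArm"] }
];
```

With these two layers, the legs and torso keep dancing while both arms follow the controllers, which is exactly the behavior asked for above.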
Beer


#12

I remember really liking just this sort of thing … several years ago. I think it was in a little Mars thing on the OpenSim platform, but it’s hard to remember exactly, as it was so long ago.