Lag with a large number of models in a domain, and an octree optimization question


I am wondering whether a domain can contain a lot of models spread across different areas. Think, for example, of a domain representing Dante's Divine Comedy (this actually exists in our educational sims on OpenSim Craft-World and in the Edmondo educational network), with the three big scenes (Hell, Purgatory, Paradise) each composed of multiple sub-scenes.
This could involve several GB of .fbx and texture data, fetched from a separate HTTP server.

Having compiled HiFi from scratch various times, I've seen that it makes use of an octree abstraction, which seems to be a way of limiting the data the client is asked to render to only what is very close to the camera, leaving the more distant data coalesced into some kind of blurred LOD simplification.

If this is true, then we could potentially have 100 GB of data in a domain without clients experiencing real lag, as long as the various scenes are distant enough from one another that they don't all need to be rendered fully; only the nearest scene is rendered.

This is quite different from the Sansar approach, where each domain needs to be downloaded in its entirety before it can be rendered. In my opinion, if HiFi works the way I described, that would be quite a good architectural bonus: it optimizes bandwidth and avoids having to create hundreds of domains to render the Divine Comedy.

Could some developer or architect confirm whether it works in a similar way, or whether it is better to split a very big scene into multiple domains?

Thanks for any insight on this question :slight_smile:

LOD, Clipping plane, Draw distance and stuff

@claudio.pacchiega you are correct, all entities on the server (and locally) are stored in an Octree.

Historically, the octree is there because High Fidelity used to be voxel-based.
See this early HiFi screenshot from circa summer 2013:

You can see the automatic LOD doing some voxel averaging on the right side (the path doesn't have steps; it's just that the averaged voxels are bigger).
And all of that happened on the server, so it was a good way to reduce the amount of data that had to be sent.

Fast forward a few years. We still use octrees on both the entity server and the client. All entities are stored in the octree at the appropriate level, based on the maximum bounding box of the object.
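To illustrate the "appropriate level" idea: since each octree level halves the cell size, you can pick the deepest level whose cells still fully span an object's bounding box. This is a minimal sketch, not HiFi's actual C++ implementation; `root_size` and `max_depth` are illustrative constants, not HiFi's real ones.

```python
import math

def octree_level_for(extent: float, root_size: float = 32768.0,
                     max_depth: int = 16) -> int:
    """Deepest octree level whose cell edge is still >= `extent`.

    Level k has cells of edge length root_size / 2**k, so we want the
    largest k with root_size / 2**k >= extent.
    """
    if extent <= 0:
        return max_depth
    level = int(math.floor(math.log2(root_size / extent)))
    return max(0, min(level, max_depth))

# A 1 m object sits 15 levels down in a 32768 m root cell;
# an object as big as the root stays at level 0.
print(octree_level_for(1.0))      # deep level for a small object
print(octree_level_for(32768.0))  # level 0 for a root-sized object
```

Bigger objects therefore live near the root, which is what makes the distance-based traversal below cheap: a shallow visit already finds everything large enough to matter.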

On the server, this lets us very quickly recurse into only the part of the tree that matters, based on the view frustum sent by each connected client. And the further we traverse from the point of view, the less deeply we need to recurse, because the objects we'd hit would be too small to see anyway.
This basically guarantees we don't send clients anything they can't see.
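The pruning described above can be sketched as a recursive traversal that stops descending once a cell's projected size drops below a threshold. This is a toy Python sketch under assumed names (`OctreeCell`, `visible_entities`, `min_angular_size`), not HiFi's code, and it omits the actual frustum test for brevity; a real traversal would also cull cells entirely outside the view frustum.

```python
import math
from dataclasses import dataclass, field

@dataclass
class OctreeCell:
    center: tuple                 # (x, y, z) of the cell's center
    size: float                   # edge length of the cubic cell
    entities: list = field(default_factory=list)
    children: list = field(default_factory=list)

def visible_entities(cell, camera_pos, min_angular_size=0.01, out=None):
    """Collect entities, pruning subtrees too small/far to see.

    size / distance approximates the cell's angular size from the
    camera; below the threshold, the whole subtree is skipped.
    """
    if out is None:
        out = []
    dist = max(math.dist(cell.center, camera_pos), 1e-6)
    if cell.size / dist < min_angular_size:
        return out  # too small at this distance: prune entire subtree
    out.extend(cell.entities)
    for child in cell.children:
        visible_entities(child, camera_pos, min_angular_size, out)
    return out

# A nearby small cell survives; an equally small cell 900 m away is pruned.
root = OctreeCell((0, 0, 0), 1024.0, entities=["big-scene"])
root.children = [
    OctreeCell((10, 0, 0), 1.0, entities=["near-detail"]),
    OctreeCell((900, 0, 0), 1.0, entities=["far-detail"]),
]
print(visible_entities(root, (0, 0, 0)))
```

Because small entities sit deep in the tree and large ones near the root, this single distance check per cell is what keeps the cost proportional to what's actually visible rather than to the total data in the domain.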

So yes, you could basically have TBs of data on your server, but as long as only a reasonable amount of it is within the clients' field of view, that should be just fine.

- Clement (aka Atlante45 in that screenshot)