Yeah, the 100 GB does sound suspect.
And yeah, prim limits were a pain in Second Life, considering that prims were local models simply repeated in an XML tree that defined their locations, so they were quite light on graphics cards.
However, texture size limitations did have their place. I'd dare say they might have been too lenient in SL, especially once people started bringing in 4K textures through third-party viewers. Thankfully HF does downsize textures to 1024 when caching, but there will still be people who put ridiculously sized textures on servers because they don't know any better (sure, it's their money).
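To illustrate what I mean by downsizing on cache, here's a rough sketch of the resolution clamp (the function name and logic are my own illustration, not HF's actual code):

```python
# Sketch: cap texture resolution at 1024 on the longest side,
# preserving aspect ratio. Hypothetical helper, not HF's implementation.
def clamp_texture_size(width: int, height: int, max_dim: int = 1024) -> tuple[int, int]:
    """Scale (width, height) down so neither side exceeds max_dim."""
    longest = max(width, height)
    if longest <= max_dim:
        return width, height  # already small enough, leave untouched
    scale = max_dim / longest
    return max(1, round(width * scale)), max(1, round(height * scale))

print(clamp_texture_size(4096, 4096))  # -> (1024, 1024)
print(clamp_texture_size(4096, 2048))  # -> (1024, 512)
print(clamp_texture_size(512, 512))    # -> (512, 512), untouched
```

So a 4K texture costs the uploader storage and bandwidth, but clients only ever pull the 1024 version from cache.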
But as discussed with OmegaHeron, the limitation definitely should not be set in stone; it should be adjustable by end users.
But there should be a default of some value, and it should be client side (as the client is the one loading the models). So if someone has a good connection, it should be entirely possible for them to download all they want.
Nobody wants to be surprised by someone walking around with an avatar totaling 300 MB (hogging your bandwidth for a few seconds and then occupying your GPU memory).
In fact this seems extremely exploitable: we might see griefers create a 1 GB junk model, slap it on a disposable Dropbox account, block the Dropbox host on their own end (to avoid loading the file themselves), and then rez the object in-world, forcing everyone else to download it. Everyone's bandwidth gets focused solely on the file, and once rezzed it occupies their memory.
Sure, people should be able to do so, but only those who have the capability to handle such a requirement and who can turn off the limitation.
However, that aside, keeping models and textures at appropriate sizes affects everyone: from the backend services hosting the files (and paying bandwidth per download) to downloading and rendering on the front end (aka the client).
Optimization of models should be encouraged; we will always be limited by hardware capabilities. Many game engines spend lots of time on optimization, even just to get things running on the latest hardware. Artists tend to create too much detail, and engineers tend to push back and say "no, optimize."
In fact, we are still missing LOD models on entities.
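For those unfamiliar, LOD just means swapping to lower-detail meshes as the camera moves away. A minimal sketch of the selection logic (thresholds are made up for illustration):

```python
# Hypothetical distance-based LOD selection, the kind of thing
# entity models are currently missing. Thresholds are illustrative.
def pick_lod(distance: float, thresholds=(10.0, 50.0, 150.0)) -> int:
    """Return LOD index: 0 = full detail, len(thresholds) = lowest detail."""
    for lod, cutoff in enumerate(thresholds):
        if distance < cutoff:
            return lod
    return len(thresholds)

print(pick_lod(5.0))    # -> 0 (full detail up close)
print(pick_lod(75.0))   # -> 2
print(pick_lod(500.0))  # -> 3 (lowest detail far away)
```

With LOD meshes available, distant avatars and entities would cost a fraction of the triangles, and only the LOD actually needed would have to be streamed in.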
We have to remember that model optimization also comes with the benefit of smaller file sizes.
So the easiest way to encourage this would be to create a flexible cap that can be adjusted or removed by the end user.
You can still do a lot with the default (and changeable) 25 MB total size per model limit I suggested. (In fact the avatar I made only hits 2 MB with textures, blendshapes, and joints.)
@Judas Yeah, the decimation work is interesting, but it won't solve the bandwidth issue, since the full model has to be downloaded first before those calculations can run.