Guest Editorial: What's Wrong with High Fidelity


High Fidelity had already started to crumble long before this. The user interface on desktop never improved. Building was, and I'm sure still is, a pain. They never fixed the font problems and were using a blurry, bad font in Qt.

High Fidelity gave users more reasons to leave than to stay. Still sad; it could have been a good platform.


I wonder if they've run out of funding yet.

Well, there's some sort of open-source issue here. I think waiting for a few external programmers (not content makers) isn't enough… There were also some Linux compiling problems in the beginning. Man, I wish I'd had some time to do a Gentoo ebuild back then…

Not exactly. All their servers were AWS Ubuntu servers as far as I’m aware, so ensuring the Linux systems were up and running was a big deal. Not to mention, their in-house programmers were able to keep up with things.

A major issue was definitely road-mapping and content support. I clearly remember, during one of the last town-hall meetings held in Maker, desktop microphone use being brought up, and how people apparently still hadn't heard of headphones, so audio loopback was a very big annoyance. It was suggested to add microphone sensitivity options, since that's what everyone else offers to help curb those issues, in addition to helping out those who have a lot of background noise. Instead, what was proposed was an AI audio system that could more intelligently duck the audio automatically by listening strictly for the user's voice, even against background noise, or something to that degree. So instead of starting with something simple and later swapping in the more advanced product, it was never completed and now we have nothing. Issues like this were very common, and that killed so many features.
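To illustrate what the "simple" option might have looked like, here is a minimal sketch of an RMS-based noise gate in Python. The `threshold_db` parameter stands in for the suggested sensitivity setting; this is just an assumption of mine, not anything from High Fidelity's actual audio code.

```python
import numpy as np

def gate_frame(frame: np.ndarray, threshold_db: float = -40.0) -> np.ndarray:
    """Mute a mono audio frame whose RMS level falls below a user-set
    sensitivity threshold (threshold_db stands in for the suggested option)."""
    rms = np.sqrt(np.mean(np.square(frame)))
    level_db = 20.0 * np.log10(max(rms, 1e-10))
    # Frames quieter than the threshold are treated as background noise.
    return frame if level_db >= threshold_db else np.zeros_like(frame)

# A quiet frame gets muted; a louder one passes through unchanged.
quiet = np.random.randn(480) * 0.001   # roughly -60 dBFS
loud = np.random.randn(480) * 0.1      # roughly -20 dBFS
assert not gate_frame(quiet).any()
assert gate_frame(loud).any()
```

Nothing fancy, but it's the kind of thing that could have shipped in a week and been replaced by the smarter system later.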

On top of that, worlds don't mean much if you don't have content, and content support was terrible. What I mean is that content creation required web-hosting knowledge, or at least some server-operation know-how, and came with at least some kind of price tag. Instead of funding popular locations that could easily have allowed people to host giant-tier servers for a whole year, the money was sunk into stress tests disguised as contests or other events. By my estimate alone, the last event (the convention one) had to cost about $1,800 in server fees (including bandwidth) and at least $21,000 in reward money for the three placings (second place had two winners), plus all the participation awards ($300 each, and a lot of entries got accepted).

Now take all that money and look at the cost of server hosting on, say, Digital Ocean (I'll use them since High Fidelity had a partner deal with them). A good-tier Digital Ocean server is maybe the $80 one with 6 CPU cores, 16 GB of RAM, 320 GB of storage, and 6 TB of bandwidth before additional charges. While the storage is serious overkill, it does mean you can do more backups, which is always a good thing. Speaking from personal experience, a humble 2-core system with just 2 GB of RAM can already do a lot and hold about 15 to 20 people without showing any issues (and that's on top of running a LEMP stack for better content-hosting options like HTTP compression). So 6 cores is going to do wonders for only $960 a year.
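To make that number concrete, here is the arithmetic, using the prices quoted above (these are the post's figures, not current Digital Ocean rates):

```python
# Figures quoted above, not current Digital Ocean pricing.
monthly_droplet_cost = 80            # USD: 6 vCPUs, 16 GB RAM, 320 GB SSD, 6 TB transfer
yearly_hosting = monthly_droplet_cost * 12
print(f"One year of hosting: ${yearly_hosting}")   # -> One year of hosting: $960
```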

Now imagine if High Fidelity had taken even a small portion of the reward money, gone to the people hosting popular locations (there were a good few), and asked them to bump up to a higher tier so they could host a good crowd, agreeing to keep the server online and up to date for at least six months in exchange for a small grant of $3,000. That's about $500 going to hosting fees and $2,500 toward polished content. Not only would there have been good example domains, but the participants would have had funding to either purchase assets or find people to make the assets they needed. Instead of creating spikes in population, it would have created an actual foundation for a population to exist, which would mean events might draw more people. My strong feeling is that the focus was so short-term on getting people in, and so long-term on adding features, that people saw an unfinished mess almost everywhere. Presentation is everything, and laggy events never win people over. So much so that, honestly, I almost never saw any of the avatar participants after the contest.
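Splitting the hypothetical grant with the same $80/month figure shows where the rounding comes from; again, these are the post's estimates, not anything High Fidelity actually budgeted:

```python
# The hypothetical $3,000 grant, split using the $80/month figure above.
grant = 3000                  # USD per popular domain, per the estimate above
hosting_six_months = 80 * 6   # $480, rounded to "about $500" above
content_budget = grant - hosting_six_months
print(f"Hosting: ${hosting_six_months}, content: ${content_budget}")
# -> Hosting: $480, content: $2520
```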

You can argue security was also an issue, but in the end it was a good chunk of issues all around that ended High Fidelity's initial chapter. A large scope was met with no solid roadmap, people were met with buggy, very unpolished experiences, a financial system that didn't understand today's demand, and a bill up front any time you wanted to do anything. While some of these issues were addressed, they were addressed too late, and the potential audience went to greener pastures that didn't have any of those problems. While the new idea they're focusing on has received criticism, it's hard to say it isn't a good one, since it lets them narrow their scope, and for their dev team, that may be just the right size.


Well, I like your words and explanations very much.
I didn't have any big problems with the server stuff, and I like the idea of the architecture, the design, and so on…
The massive problems on my side were in the “interface” program, on the Linux side…

I am absolutely convinced by the High Fidelity construct, its implementation, and its open-source approach.
BUT I am only a technician, and I know that several more things have to come together to make a huge success.

But deep in my heart HiFi isn't dead, and it breaks my heart a little that I can't show the world the possibilities of this framework, given the limited spare time left over from working as a pipecleaner for other corps…

So sorry about that…