AR Applications for HiFi


Folks, I haven't seen any comments on my post in the Continuity thread, so I am guessing it didn't draw much attention, but it is such an important point that I've created a new topic just for it.

It seems likely that whichever virtual environment is first past the post in implementing an AR point cloud (see my post in the Continuity thread, linked above) will win a race in which all other players are wiped out. I believe it is that powerful. We can see that AR and VR technologies are converging. As explained there, HiFi currently appears to be the best technical solution, assuming it can be equipped with continuity options. I am personally eager to see the best technical solution win; we have all seen cases in the past where the technically best of all kinds of things was beaten into extinction by market forces, usually to the detriment of users.

Hopefully all can see also that AR applications are the real world enablers, so this is where the real money is. Once AR is empowered with the beginnings of an AR point cloud, and headsets get smaller and lighter, AR users will see all kinds of useful, attractive, creative objects superimposed on the real world wherever they go. They will be wearing these things every waking hour, even in bed; they will never want to take them off. No more need for physical displays. Can you imagine driving your car along a winding hilly road, with you, and everyone else, seeing the road ahead snaking through the hillside, as if you could see right through the hills, night or day? Or sitting in a real world place like a railway platform, watching a 3D character animated dance performance, seemingly interacting with real people, right there on the platform?

Further, using a HiFi avatar in the pure VR space, you could virtually visit a real world meeting, and sit at the real world table, participating in the meeting, with everyone seeing you, and hearing you, as if you were there.

Virtual objects created for the real world would have real world value, not just virtual value like objects created only for a virtual world. There will likely come a time when cities opt for virtual-only festive season decorations, which appear only to those who are AR equipped, because these will be capable of fantastical displays not possible in the physical world, and will still be cheaper, probably community created. It would no longer be necessary to have street lights, or building lights, as everything could be virtually illuminated.

We could choose to exclude ourselves from this new AR enabled world, or become part of it, even instrumental to it. I know which I choose. Which do you choose? HiFi is leading in the race of pure VR environments. All it needs to switch lanes, and take the lead also in the race of AR compatible VR environments, is to add continuity options, as described in my post there, making it ready for the AR point cloud.

If you are inspired, think of more examples of AR applications that you think would be useful in the real world, and post them here in the replies. I suspect (and hope!) this list will grow very long, very fast.


If I try to imagine what a meeting in AR could be, I see other avatars in my RL location, and these other people would see my avatar in their own RL locations. I could probably render entities that others will be able to see, which would probably run on the server of whoever hosts the current “conference”.

I think that HF is, give or take a few things, able to do this.

What I don't see happening yet is you and me appearing in the same RL location (with me being far away). That would require something to push a representation of your RL environment to me. (I think it will take time before we see this, even from the “competitors”.) People might instead meet in a virtual environment (or in a scan of an RL location, though probably not one synchronized with the RL for a long time).


@Alezia.Kurdis, imagine this being done already by the AR point cloud. A virtual version of the meeting environment would already exist in HiFi, in the AR point cloud covering that location. The AR headsets of the other folks at the meeting would automatically update and/or add detail to the AR point cloud as necessary, in order for you to see the meeting location in sufficient detail in the VR space. The technology needed to rez moving figures at the meeting, for you to see in the VR space in real time, is still the subject of research (imagine real-time markerless mocap and avatar skinning), but the technology needed to instantly capture and display a detailed scan of the static objects in a meeting room is already there; see BundleFusion for example (my own PhD project uses something like this, and depends on it).
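As a rough illustration of the kind of incremental fusion such scanners perform, here is a much-simplified sketch of my own (not BundleFusion's actual pipeline): it assumes each incoming scan is already aligned to the shared frame, and simply deduplicates points on a voxel grid so the shared cloud stays bounded as headsets keep contributing.

```python
import numpy as np

def fuse_scan(cloud: np.ndarray, scan: np.ndarray, voxel: float = 0.05) -> np.ndarray:
    """Merge a new scan (N x 3 points, already aligned to the shared frame)
    into the accumulated cloud, keeping one point per voxel cell."""
    merged = np.vstack([cloud, scan]) if cloud.size else scan
    # Quantise to a voxel grid and keep the first point seen in each cell.
    keys = np.floor(merged / voxel).astype(np.int64)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return merged[np.sort(idx)]

# Two overlapping "scans" of the same wall collapse into one set of voxels.
scan_a = np.array([[0.0, 0.0, 0.0], [0.10, 0.0, 0.0]])
scan_b = np.array([[0.01, 0.0, 0.0], [0.20, 0.0, 0.0]])  # first point lands in an occupied cell
cloud = fuse_scan(np.empty((0, 3)), scan_a)
cloud = fuse_scan(cloud, scan_b)
print(len(cloud))  # 3: the duplicate point was absorbed
```

A real system would of course also need registration (aligning each scan before fusing) and surface reconstruction, but the voxel deduplication above is the essence of why the shared point cloud doesn't grow without bound as more headsets contribute.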


I don’t really think AR is relevant to HiFi.
AR is about staying in the real world but enhancing it.
HiFi is about visiting entirely different new worlds, completely separate from the real world. The program isn’t really designed around mixing the two together.

I definitely agree that AR will be very useful in all sorts of ways in the future. I just don’t think HiFi is the right app for it.


I think you will be able to reach HF from an AR headset, but to do VR, and even render it partially if you want (like playing Minecraft in AR).

Outside of that, I don't see how it can be AR and have any advantage, as the representation of your RL to any other participant will always be virtual (unless you are both in the same location… but in that case using HF would be pointless, like using a hammer to screw in a bolt).


@Theanine, your feeling that HiFi is not applicable to AR is understandable. As mentioned in my initial post in the Continuity in virtual spaces thread, that view is probably shared by many existing HiFi users, but I've given research-based reasoning for concluding that, with the advent of the AR point cloud concept, AR and VR are converging, and HiFi is head and shoulders ahead of anything else out there.

I would not call HiFi an “app”; it is more of a system, a distributed system of servers and clients. It is this distributed nature, and the infinite open scalability such an architecture enables, which makes it uniquely suited to hosting the AR point cloud, which is in turn the enabler for AR, the holy grail currently being sought by AR companies.

The biggest technical and commercial difficulty with implementing this as a single company is that development of a distributed, open, infinitely scalable system such as HiFi runs counter to the usual corporate mentality that all things have to be closed and proprietary, so as to extract maximum revenue from them.

This open, distributed kind of architecture can only be built from the ground up, enlisting the help of developers from around the world on an open source basis, in the same way the cryptocurrencies have done. Somehow @philip / HiFi has achieved this, and I believe HiFi is the only VR capable environment empowered by this architecture. This also explains why HiFi is uniquely suited to a cryptocurrency based economy. If HiFi doesn't corner this market, others will follow, just much later. But then, if they have the capabilities I've mentioned (i.e. usable for both AR and VR), with continuity, and have real world crypto cash to offer for virtual items, how relevant will HiFi be if it can only do VR?

As I said, the only fundamental requirement missing is continuity. Implementing it readies the system for the AR point cloud. Once that is done, folks like me, who are planning directly or indirectly to enable methods of gathering point cloud data from around the world, can get on with that task.

@Alezia.Kurdis, the trick with AR, again enabled by the AR point cloud, is to hide those things in the AR space which are mirrored from the real world into the virtual world; i.e. if they are the same, don't show them in AR mode. An exception might be where they are normally obscured from the user's view in the real world, as in the roads-seen-through-hills example I gave earlier.

The AR point cloud itself would never be for display in the real world (unless perhaps for physical reconstruction purposes), because it replicates the real world; it is just meant as a reference for other objects created and appropriately positioned in the VR space. The AR point cloud would, however, be fully visible in VR-only mode, allowing us to explore and operate around it there, so as to create and place the virtual items we wish to display in the real world. This way, creators could explore the point cloud anywhere in the VR world, identify opportunities for particularly beneficial virtual features, create the appropriate items, and place them, all within VR, without needing to physically visit the real world location. These could also conceivably be licensed for display, for those AR users wishing to pay to see the feature.
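The show/hide rule could be sketched as a simple visibility filter. The entity flags and names below are hypothetical illustrations of mine, not any existing HiFi API:

```python
from dataclasses import dataclass

@dataclass
class Entity:
    name: str
    mirrors_real_world: bool      # part of the AR point cloud's replica of RL
    obscured_in_rl: bool = False  # e.g. a road hidden behind a hill

def visible(entity: Entity, mode: str) -> bool:
    """AR mode hides anything the user can already see in RL;
    VR mode shows everything, the point cloud included."""
    if mode == "VR":
        return True
    # AR: show virtual-only items, plus mirrored items normally hidden from view.
    return (not entity.mirrors_real_world) or entity.obscured_in_rl

scene = [Entity("dancer", False), Entity("platform", True),
         Entity("road-through-hill", True, obscured_in_rl=True)]
print([e.name for e in scene if visible(e, "AR")])  # ['dancer', 'road-through-hill']
```

In VR mode the same scene shows all three entities, which is what lets creators work against the full point cloud while AR users see only the virtual additions.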

Because such items would exist only in the VR space, they would automatically be displayed in the AR space. Thus only virtual items intended for display are shown in the real world. Conversely, this also enables the system to identify objects which have changed in the real world. As the author of one of the PracticalVR articles points out (see the links in my post on continuity), things such as litter could be flagged. People could conceivably receive automatic crypto rewards for correcting them, either in the AR space, say by physically collecting the litter, or in the VR space, by altering the AR point cloud.

Before folks launch a barrage of “but what ifs”: yes, much development would be needed to iron out the loopholes here; the devil is always in the detail! But hopefully you can see the overall concept of the AR point cloud, how it is a real game changer (pun intended!), and how it could be relatively easily implemented in HiFi, assuming HiFi can be given some fundamental continuity options.


Further to some of the discussions on identity and privacy concerns seen previously in other threads, the AR show/hide logic could possibly apply to identities too. Users whose identities correspond in both the real world and the virtual world, indicating they are real world people, could be automatically hidden by default, as they are the same in both worlds, thus addressing some privacy concerns. Fictitious users created for role-play, on the other hand, could be automatically visible in the real world, as they are virtual-only characters, with no privacy concerns beyond perhaps the anonymity their creator might desire.


Some players in the AR Point Cloud:

Don't just take my word for it that VR and AR are converging; have a look at what some others are saying (spot Philip in the second one):

The next one is a year old now. Look out for the statement “A similar issue exists for augmented reality glasses. Looking through AR glasses at the real world, we would expect that information is mapped to the surroundings, something that is generally not possible today. This will be an important development for upcoming AR devices. The driver behind this development could be targeted advertising, for example.”

As we can see, less than a year later, the AR point cloud is the answer: it provides the missing reference for the surroundings, in VR form, to which the information in the AR glasses is mapped.

Should HiFi just ignore the AR/VR convergence?


I always wondered if there was a quick way to map the room you're in with the roomscale setup on the Vive, and present it through the front camera. It seems like the hardware is there, but the use case is still foveated… er, blurry.


It would be interesting if we could create “Mixed Reality Zones” in HiFi :grinning:
Someone would just need to connect to a domain with a device that can give the user's location in the real world (with enough precision). Once in this kind of zone, HiFi could use other, more suitable protocols (like point clouds, but also those of cloud gaming streaming).
This would allow cities, or remarkable places whose 3D models already exist, to be available any time and anywhere (not just behind a computer). If and only if there are enough AR users… then we won't even need to ask about the benefits of such a feature.

However, I doubt the idea of “no more need for physical displays” will come true, since there are already very precise holographic technologies such as Holoverse by Euclideon.
I'm not saying that we'll soon have premium designer glasses to see these holograms (they are needed to filter the light according to the user's position), but they will inevitably compete with augmented reality devices.

And a further fundamental question arises: will people accept that their lives depend almost entirely on VR/AR?
We can see in PSYCHO-PASS that people accept holograms as the only light source indoors, to the extent that it becomes a de facto standard of normality… but that story is supposed to happen in the next century. And let's not talk about people who feel they are being forced to change their devices too often.
On a side note, it is also interesting to see that in Psycho-Pass, VR is used at home when you want to stay anonymous (for better or worse).


@whyroc, I don't have a Vive, but reading up on it a little (you've probably seen this already), I am guessing you have the pair of Lighthouse IR/laser boxes. Assuming the lasers are rangefinders, I guess these boxes could make a rough map of the room, but as they are static, a lot of detail would likely be missed. The space you have to play in presumably only needs to be represented internally as a box with roughly the same dimensions as the room.

A single structure sensor, mounted on the headset, with something like BundleFusion (see the link in my reply to Alezia above) processing the signals from the sensor, would scan the room as you move around, building up a fully rendered mesh of the room in seconds. If the storage for this were set up as an asset in your HiFi sandbox, referenced around the initial position of your av, then in theory, in first person view, you should see the room form around you as you move.

Please don't read this as simple; I suspect getting the structure sensor to work with the BundleFusion software will require some work, but I know it is possible, as this is the sensor used in the BundleFusion project. I still have to carry out a similar task myself as part of my own project. If the details aren't published already, I will post them, though I admit I'd rather be quoting them!

@ZeFish, for sure it would be fascinating; it looks like whyroc is interested also. I think your “Mixed Reality Zone” corresponds to the AR Point Cloud layer term I've been using, and yes, all that would be needed to position a real world AR user relative to the AR point cloud is an accurate fix of their RL position. In the presence of an AR point cloud, the fix could be refined simply by the AR glasses having appropriate sensors to compare the sensed environment with its AR point cloud representation.
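That refinement step could look something like the following sketch: a single rigid-alignment (Kabsch) step matching the headset's sensed points onto the stored cloud. Correspondences are assumed known here for simplicity; a real system would estimate them (e.g. by nearest neighbours) and iterate, i.e. ICP. All names are illustrative.

```python
import numpy as np

def refine_pose(sensed: np.ndarray, reference: np.ndarray):
    """One Kabsch step: find rotation R and translation t mapping sensed
    points onto corresponding reference (point cloud) points."""
    cs, cr = sensed.mean(axis=0), reference.mean(axis=0)
    H = (sensed - cs).T @ (reference - cr)       # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cr - R @ cs
    return R, t

# Recover a known offset of the headset's scan relative to the stored cloud.
reference = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
sensed = reference + np.array([0.3, -0.1, 0.2])  # pure translation for simplicity
R, t = refine_pose(sensed, reference)
# R comes out as the identity, and t exactly undoes the 0.3/-0.1/0.2 offset.
```

Applying R and t to the headset's coarse position fix would snap it onto the stored AR point cloud, which is the refinement described above.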

I guess it would be possible to have both gaming and real world versions of these “world” scale layers, with the real world point cloud being optionally visible in game layers too. They would need to be kept well separated, to prevent game action leaking to other AR users in the real world, some of whom I guess would be pretty terrorised if they unexpectedly found themselves in the middle of a virtual war zone, or worse :slight_smile:

On display technology, things like Euclideon's Holoverse are pretty impressive (thanks for the link), but awfully expensive as far as I can tell, and limited to a small fixed volume, whilst AR glasses will reproduce the full holographic effect everywhere the user goes, at most likely a fraction of the cost, so my money is on them eventually making everything else obsolete.

On whether or not folks will accept becoming dependent on VR/AR, my own opinion is that everyone will be drawn in when they see how much it enriches the lives of other folks using it.

Thanks also for the link to Psycho-Pass; it seems worth watching. I am curious to see how they deal with anonymity.


I can’t wait for crowd sourced pointcloud so I can get a hololens headset and finally become a gargoyle!


@ZeFish I watched that episode of psycho-pass, I didn’t realise it was a series… now I am gonna have to watch all of them :slight_smile:


Yeah! Psycho-Pass is truly visionary, in the sense that many of the things described in it are already happening in some parts of the world (the movie after the 2nd season explains the context better). And we really see that it's about our world, in a near future closer than we might think, when we see references to Stackexchange, for example. Some people find it too dark, but if you don't let yourself be too impressed, it becomes quite easy to understand the messages the author tried to convey.


OK, so I watched all of the Psycho-Pass episodes, and I can see why some folks might call the series dark. But maybe it is already outdated, if we think of the open source communities who are already up and running with their own community oriented versions of this kind of functionality (minus Sybil!). I have a fair amount of faith that the crowd will reach these kinds of solutions long before the corporates even become aware of their requirements.


Merry Xmas everybody. Some news: I am no longer in business for profit. I need your support. It's a worthy cause: “Help a new crowd sourced VR enabled AR initiative. Give to the Worldwide VRENAR Angels Community.” Raise funds on @generosity


In response to feedback from others on the VRENAR Community campaign, I've updated the graphics and text to clarify the project purpose and remove the societal analyses and comments. (Those were originally included based on techniques of persuasion recommended in a certain book, which I won't name here, but it is now clear they would have been more likely to make the campaign “Lose Bigly” rather than the opposite.) :wink: I would appreciate queries and/or feedback on the campaign from any and all in HF. I believe we all want the same things in the end. Let's work together to get them!


I don't really see the AR part in what you illustrate. The use cases are mostly VR ones that AR gear can also handle. I don't quite see the problem you said exists.

You put VR on one side, and AR on the other… but in fact, AR is able to do VR, so… ???

Maybe I don’t see what you are trying to demonstrate (other than solicitation).


Hi Alezia, thanks for looking, and for your question. I believe AR generally cannot host avatars, only static or scripted objects. I believe VR technology needs to be added to AR to make it interoperable like a virtual world; then avatars and objects can be generated from the real world scene scanned by multiple AR glasses. Correct me if I am wrong please, anyone…


Sorry, but I just have to joke a little about the use of “AR”: back in another VR world, it meant Abuse Reports.