Building optimally for Oculus Quest

Hi

Does anyone know what would be optimal when building something that will be acceptable for an Oculus Quest user?

What’s the acceptable limit on polygon count?
How much texture data can it manage?


I’ve recently started to experiment with Oculus Quest development on High Fidelity.
I’ve succeeded in creating a collaborative space, like a virtual office, that looks quite nice.
Unfortunately, we are still limited in what is possible to develop on this platform.
Because we don’t have web entity and web overlay support, we can’t display slides, websites, or video.
I’ve tried to create a slide display based on images or on changing textures, but it doesn’t work.
The avatar override actions for sitting don’t work either.

Quest limitations on building worlds:

Players in Room: ~10
Poly Count: 50,000 or less
World Size: 20 MB or less
Draw calls: 50 or less
Lights: not supported
Shadows: not working
Shaders: not supported

Recommended Quest versions of avatars:

Poly Count: 5K or less
Material Count: 1
Draw calls: 3 or less
Bone Count: 66 (Standard Humanoid)
Flow scripts and cloth are not supported at all on Quest.

It would be nice if you all shared your experiences with Quest here: what is working and what is not working (yet).


Considering that High Fidelity’s tablet system already has a staggering number of draw calls, that alone eats up half of the anticipated 50 draw calls developers are limited to. Reported it ages ago, even brought it up during a meeting but… eh.

I would say the easiest way High Fidelity can make the Quest work is to update their FST format to include a lowmodelurl as an alternative model URL (and a lowtextureurl as well). That way, you can have PC and Quest systems in the same environment without too much difficulty. (Seriously, that’s like a three-line addition to the FST reader; see the sketch below.)
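For reference, an FST file is just a set of key = value lines, so the change could look roughly like this. This is only a sketch: lowmodelurl and lowtextureurl are the keys proposed here, not something the current FST reader understands, and the file names are made up.

name = MyAvatar
type = body+head
scale = 1
filename = mannequin/mannequin.fbx
texdir = mannequin/textures
lowmodelurl = mannequin/mannequin_quest.fbx
lowtextureurl = mannequin/textures_quest

The client could then pick filename/texdir on PC and fall back to the low variants on Quest, so one avatar entry serves both.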

I guess I should submit that on Cann- Oh wait.


Can you define what a ‘draw call’ is?

Basically, a draw call contains all the information telling the GPU about textures, states, shaders, rendering objects, buffers, etc.; it is the CPU work that prepares drawing resources for the graphics card. Converting state vectors (all the information mentioned before) into hardware commands for the GPU is very “expensive” for the CPU, and API complexity becomes API overhead that does not help.

Since a draw call is required for each different material, having a variety of unique objects and multiple different materials in a scene raises the number of draw calls accordingly. Since the CPU work to translate this information into GPU hardware commands takes time, we sometimes see CPUs bottlenecking GPUs precisely when a high number of draw calls is involved.
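To picture that, here is a schematic, WebGL-style JavaScript sketch (not High Fidelity’s actual renderer): each mesh part with its own material forces the CPU to change state and issue another draw call.

// Schematic only: assumes a scene whose meshes are split into one part per
// material, and an already-configured WebGL context "gl".
function renderScene(gl, scene) {
    for (var m = 0; m < scene.meshes.length; m++) {
        var mesh = scene.meshes[m];
        for (var p = 0; p < mesh.parts.length; p++) {
            var part = mesh.parts[p];
            gl.useProgram(part.material.program);             // CPU work: shader state
            gl.bindTexture(gl.TEXTURE_2D, part.material.map);  // CPU work: texture state
            gl.bindVertexArray(part.vao);                      // CPU work: geometry state
            // The draw call itself: one per mesh part / material.
            gl.drawElements(gl.TRIANGLES, part.indexCount, gl.UNSIGNED_SHORT, 0);
        }
    }
}

Merging objects that share a single material collapses the inner loop, which is why combining meshes and materials is the usual way to cut draw calls.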

[Image: perfectly balanced CPU and GPU work resulting in frame times of 16.66 ms, which equals 60 frames per second]

For this purpose we can take the above illustration of how CPU and GPU work are related in an ideal situation, where CPU performance matches GPU performance and is perfectly aligned for 60 FPS rendering. As you can see, CPU work is always done ahead of the GPU for each of the 3 rendered frames on our basic rendering-loop timeline. The CPU completes its work for the first frame; while the GPU renders that frame, the CPU works on the next frame (16.66 ms to 33.33 ms), then the CPU works on the third frame from 33.33 ms to 49.99 ms, and so on.

The red square(ish) blocks are the time slots the CPU uses for each of the four draw calls per frame. We can think of the illustration as a DirectX 11 situation, but with ideal conditions and no bottlenecks, since both CPU and GPU complete their work in a timely manner.

However, this is almost never the case, and our next illustration might give us a better idea of how things sometimes really look:

[Image: a not-so-perfect situation where the CPU is too slow and bottlenecks a GPU that could handle 16.66 ms frames (60 FPS) but must wait on the CPU instead]

Here we can see a situation where we have a very slow CPU that can’t prepare all the material for the GPU on time. The CPU work for our four draw calls takes 21.34 milliseconds per frame. Our GPU is fast enough for 16.66 ms frames (60 FPS, or it may even be faster), but since it must wait for the CPU, it sits idle after it renders the frame it actually has data for. With the CPU taking 21.34 ms per frame, and assuming our GPU is exactly as fast as needed for 16.66 ms frames (60 FPS), GPU usage drops to about 78% (16.66 / 21.34 ≈ 0.78), which is effectively CPU bottlenecking.

CPU bottlenecking can also happen for various other reasons, like the time the CPU needs to process physics, load things, or handle game mechanics, network data, or AI: all the work that does not fall under the draw-call umbrella but is still a CPU task. In all cases where the CPU is not completing those tasks fast enough, we experience CPU bottlenecking that makes our GPU wait and sit idle, which shows up as lower-than-maximum GPU usage.

* Cited from: [Tonči Jukić](https://medium.com/@toncijukic)
  https://medium.com/@toncijukic/draw-calls-in-a-nutshell-597330a85381

Hey, George. Ignore the negativity and go visit Music. The whole building is one draw call, thanks to @Judas, and we had 11 dense avatars there without problem.

Of course, we can’t yet try it on the Quest. But building for it is very possible.


Actually, I am confident that we can build for Quest. I’ve just built a nice office space that loads and runs smoothly, but I need to figure out how to implement different tools for collaborative work without having web entities or overlays. I will try to see if an animated FBX can be used for slide presentations. I’ve already tried images and texture changing without success.


@DrFran, this has nothing to do with negativity, and you missed the point entirely. The issue is as George stated: the systems he relied on are no longer working on the Oculus Quest, rendering the concept he had prepared no longer useful.

The other mentions of material limits and whatnot are observations from other platforms meant to act as a kind of guide, not negative criticism. 2,000 polygons was the limit for Quake 2 player models (with the character and gun models each capped at 2,000 polys). That was not negative criticism of the game; it was just a limit of the engine.

That being said, I’m shocked that texture/image changes weren’t working. That sounds like a bug and a half, since that isn’t a plugin-based system like WebEngine or the Overlay system. Considering overlays aren’t working, that rules out being able to cheat by creating a WebEngine overlay ourselves. I’m not sure what compatibility there is with WebEngine on Android via Qt, but I guess that’s what the Android app is there for, to compare with (and honestly, I don’t remember whether it worked or not).

Sorry I am not as smart as you, but I am glad I am way more optimistic.

Finally, I’ve managed to create a fully functional Quest collaborative space, with nice PBR materials and a slide presenter. Because we don’t have overlays, I’ve updated the sitting script accordingly. You can visit the domain at: hifi://smartfactory
On Quest you must use this Beta version of HiFi:
https://builds.highfidelity.com/HighFidelity-Quest-Beta-10621-9b1c7b3.apk


You can use this Windows build of Interface and Sandbox together with the Quest version:
Windows: https://builds.highfidelity.com/HighFidelity-Beta-10621-9b1c7b3.exe


Hi George,

Thanks for showing me around your Quest space. Things are progressing. We should do a crash test at your domain to see how many it will hold :grinning:

Can you run it side by side, so that I don’t mess up my current Windows install?

Yes, I think it will be installed in a different folder. You can also choose a custom install and rename the install folder from High Fidelity to, for example, HighFidelity Quest.

I visited your Quest domain, but there was no one home :stuck_out_tongue:


I’ve found a method to share a screen to HiFi for Oculus Quest.
It is based on screen capture, image upload to an FTP server, and a script that reloads the image.
For capturing, automatic timed upload, image processing, etc., you can use ShareX, a free application: https://getsharex.com/
You can download my configuration file for this application from here:
https://transmissiongate.com/hifi/quest/ShareX-conf.sxb

To import the settings, go to Application Settings / Export Import, click Import, and select the file.
You can change the FTP/SFTP connection under Destinations / Destination Settings / FTP FTPS SFTP; add your server and credentials.
There are a lot of other nice tools implemented for image resizing, cropping, watermarking, adjustment, etc.
You can grab the entire screen or a specific area.
To start sharing, go to Capture / Auto Capture, configure the frequency, and press Start.

In High Fidelity, add an image entity and place this script on it:
https://transmissiongate.com/hifi/office/pquestshare.js
In the script, you can adjust the URL variable to point to the server/path where the snapshot is uploaded.
Your screen will be grabbed and uploaded every n seconds. To update the image in HiFi, you click on it, so the screen is only refreshed when the slide changes and the Quest isn’t stressed.
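If it helps to see the idea, a minimal reload-on-click entity script could look like the sketch below. This is only an illustration of the approach, not the actual pquestshare.js, and the URL is a placeholder for your own upload path.

(function () {
    // Placeholder: point this at the snapshot ShareX uploads to your server.
    var URL = "https://your-server.example/quest/screen.png";

    this.clickDownOnEntity = function (entityID, mouseEvent) {
        // Re-setting imageURL with a timestamp query string defeats caching,
        // so the freshly uploaded snapshot is fetched on every click.
        Entities.editEntity(entityID, {
            imageURL: URL + "?t=" + Date.now()
        });
    };
})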

Happy presentations on Quest!

For sitting on Quest, you can use this script:
https://transmissiongate.com/hifi/sitb.js
Place it on a hidden, collisionless box that overlaps your chair.
On the same box, set this user data:

{
  "grabbableKey": {
    "grabbable": false,
    "triggerable": true
  },
  "seat": {
    "offset": {
      "y": -0.02
    }
  }
}

Adjust the position of the box and the offset variable.
When an avatar enters this box, it goes into a sitting position.
To stand up, move forward or backward.
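To give a rough idea of how such a trigger box can be scripted, here is a sketch (not the actual sitb.js; the animation URL and frame numbers are placeholders):

(function () {
    // Placeholder sit animation; swap in whichever sitting clip you use.
    var SIT_ANIMATION = "https://your-server.example/animations/sitting_idle.fbx";

    function readSeatOffset(entityID) {
        // The seat offset comes from the userData shown above.
        var props = Entities.getEntityProperties(entityID, ["userData"]);
        try {
            return JSON.parse(props.userData).seat.offset;
        } catch (e) {
            return { y: 0 };
        }
    }

    this.enterEntity = function (entityID) {
        var offset = readSeatOffset(entityID);
        // Override the avatar animation with a looping sit pose and
        // nudge the avatar down by the configured offset.
        MyAvatar.overrideAnimation(SIT_ANIMATION, 30, true, 1, 35);
        MyAvatar.position = Vec3.sum(MyAvatar.position, { x: 0, y: offset.y || 0, z: 0 });
    };

    this.leaveEntity = function (entityID) {
        // Moving out of the box restores the normal animations (stand up).
        MyAvatar.restoreAnimation();
    };
})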

Because Qt WebEngine is not available on mobile platforms, where policy dictates that all web content is displayed using the operating system’s web view, Qt WebView can be used instead. It provides a way to display web content in a QML application without necessarily including a full web browser stack, by using native APIs where it makes sense.
https://doc.qt.io/qt-5/qml-qtwebview-webview.html
https://doc.qt.io/qt-5/qtwebview-minibrowser-example.html#
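For comparison, minimal Qt WebView usage in QML looks roughly like this (in the spirit of the linked docs; whether it could stand in for the WebEngine-based web entities in Interface is exactly the open question):

import QtQuick 2.12
import QtWebView 1.1

// Displays web content using the platform's native web view
// (on Android, the system WebView) instead of a bundled Chromium.
WebView {
    id: webView
    anchors.fill: parent
    url: "https://highfidelity.com"
}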

WebEngine still isn’t mobile compatible? Huh…

I wonder if it can’t be forced. I mean, I was able to compile Qt for the Raspberry Pi and that ran just fine.

Even then, a big issue would be ensuring compatibility between WebView’s and WebEngine’s parameters, which the server needs so it can communicate them between clients, namely the entity parameters. I’m guessing it wouldn’t be too hard, but it is always something to think about.

Another thing is whether a WebView can even be used on the Quest, since it sounds like it’s based on whatever the mobile system has to offer, assuming it’s even there.

That’ll be an interesting PR, but hopefully one that just requires changing WebEngine to WebView for web entity calls in the Quest version.