Crash caused by GLSL version


#1

Another startup crash related to GLSL appears to have been introduced in the latest GIT. Here is the full log; the relevant part appears to be this:

[DEBUG] [08/02 22:43:24] GLShader::compileProgram - One of the shaders of the program is not compiled?
[DEBUG] [08/02 22:43:24] GLShader::compileShader - failed to compile the gl shader object:
[DEBUG] [08/02 22:43:24] 0:1(10): error: the compatibility profile is not supported
0:1(10): error: GLSL 4.30 is not supported. Supported versions are: 1.10, 1.20, 1.30, 1.00 ES, and 3.00 ES

[DEBUG] [08/02 22:43:24] GLShader::compileProgram - One of the shaders of the program is not compiled?
Segmentation fault

In case it’s relevant: since I’m on Linux I use the Mesa driver, which doesn’t support OpenGL 4 yet, only up to 3. Do I need to enable a special setting for that, or hack some part of the code as a workaround?
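For reference, here is a minimal way to see what the driver actually exposes to the application (a generic sketch, not part of the Interface codebase; it assumes a Mesa-style <GL/gl.h> and must be called while a GL context is current – glxinfo reports the same information from the command line):

#include <GL/gl.h>
#include <cstdio>

// Print the vendor/renderer strings plus the GL and GLSL versions of the
// current context. The "Supported versions are: ..." list in the crash log
// above reflects the same driver limits.
void dumpGLVersions() {
    printf("GL_VENDOR:   %s\n", (const char*)glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", (const char*)glGetString(GL_RENDERER));
    printf("GL_VERSION:  %s\n", (const char*)glGetString(GL_VERSION));
    printf("GLSL:        %s\n", (const char*)glGetString(GL_SHADING_LANGUAGE_VERSION));
}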


#2

Yeah… we’re in our last few days of being able to find quick workarounds, methinks: https://github.com/highfidelity/hifi/blob/master/libraries/gpu/src/gpu/Config.slh#L26

It’s strange – at least the vocal Linux users seem to be running not-the-latest gaming rigs, and yet these changes keep coming in as if there are other types of Linux users out there (which makes me wonder who they are and where they’re hanging out).

I tried going back in time a little; it wasn’t enough. If it helps, here is the git hash and a diff of the gpu library as I’m currently working with it (which includes some extraneous changes I was trying, like logging the shader source on errors):

$ git describe
stable-5042-gb46a430
$ git diff libraries/gpu/
diff --git a/libraries/gpu/src/gpu/Config.slh b/libraries/gpu/src/gpu/Config.slh
index f24b54e..4aaf7c5 100644
--- a/libraries/gpu/src/gpu/Config.slh
+++ b/libraries/gpu/src/gpu/Config.slh
@@ -11,7 +11,7 @@
 <@if not GPU_CONFIG_SLH@>
 <@def GPU_CONFIG_SLH@>

-<@if GLPROFILE == PC_GL @>
+<@if GLPROFILE == xPC_GL @>
     <@def GPU_FEATURE_PROFILE GPU_CORE@>
     <@def GPU_TRANSFORM_PROFILE GPU_CORE@>
     <@def VERSION_HEADER #version 430 compatibility@>
@@ -21,9 +21,11 @@
     <@def VERSION_HEADER #version 120
 #extension GL_EXT_gpu_shader4 : enable@>
 <@else@>
-    <@def GPU_FEATURE_PROFILE GPU_CORE@>
-    <@def GPU_TRANSFORM_PROFILE GPU_CORE@>
-    <@def VERSION_HEADER #version 430 compatibility@>
+    <@def GPU_FEATURE_PROFILE GPU_LEGACY@>
+    <@def GPU_TRANSFORM_PROFILE GPU_LEGACY@>
+    <@def VERSION_HEADER #version 120
+#define gl_VertexID 0@>
 <@endif@>

 <@endif@>
+    <@def GPU_TRANSFORM_PROFILE GPU_LEGACY@>
diff --git a/libraries/gpu/src/gpu/GLBackendShader.cpp b/libraries/gpu/src/gpu/GLBackendShader.cpp
index 9c0cf76..1f00802 100755
--- a/libraries/gpu/src/gpu/GLBackendShader.cpp
+++ b/libraries/gpu/src/gpu/GLBackendShader.cpp
@@ -177,13 +177,7 @@ GLBackend::GLShader* compileShader(const Shader& shader) {
     // if compilation fails
     if (!compiled) {
         // save the source code to a temp file so we can debug easily
-       /* std::ofstream filestream;
-        filestream.open("debugshader.glsl");
-        if (filestream.is_open()) {
-            filestream << shaderSource->source;
-            filestream.close();
-        }
-        */
+        fprintf(stderr, "%s", shaderSource.c_str());

         GLint infoLength = 0;
         glGetShaderiv(glshader, GL_INFO_LOG_LENGTH, &infoLength);
diff --git a/libraries/gpu/src/gpu/GPUConfig.h b/libraries/gpu/src/gpu/GPUConfig.h
index 5590c4c..39e2d43 100644
--- a/libraries/gpu/src/gpu/GPUConfig.h
+++ b/libraries/gpu/src/gpu/GPUConfig.h
@@ -37,8 +37,10 @@
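(For anyone reproducing this: the debug change above boils down to dumping the failing shader plus the driver’s error text. A generic version of that idea looks roughly like the sketch below – it is not the actual Interface code, and it assumes the GL function pointers are already loaded, e.g. via GLEW, with a current context.)

#include <GL/glew.h>
#include <cstdio>
#include <string>
#include <vector>

// On compile failure, print the shader source followed by the driver's info log.
void dumpCompileFailure(GLuint glshader, const std::string& source) {
    GLint compiled = 0;
    glGetShaderiv(glshader, GL_COMPILE_STATUS, &compiled);
    if (compiled) {
        return; // nothing to report
    }
    fprintf(stderr, "--- shader source ---\n%s\n", source.c_str());

    GLint infoLength = 0;
    glGetShaderiv(glshader, GL_INFO_LOG_LENGTH, &infoLength);
    std::vector<char> infoLog(infoLength > 0 ? infoLength : 1, '\0');
    glGetShaderInfoLog(glshader, (GLsizei)infoLog.size(), nullptr, infoLog.data());
    fprintf(stderr, "--- driver info log ---\n%s\n", infoLog.data());
}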

#3

I’m a bit worried. Since Windows 10, Linux has become much more interesting, but I’m not sure whether High Fidelity is going to run well with my Nvidia 670 on Linux.

Linden Lab made the same mistake by not supporting Linux well, and these days it has only gotten worse. And who knows what they will do in the future? So I hope High Fidelity isn’t making the same mistake.

I’m still not sure whether I want Linux, but Windows 10 is heading in the wrong direction.


#4

I run Interface on Linux and, so far, have been able to continue doing so with minimal effort. But I use an Nvidia GTX 660M equipped Alienware laptop with Nvidia’s proprietary (closed-source binary) driver, stay on the bleeding edge of the Ubuntu distro, and install packages from outside the mainline repositories.


#5

We’re currently doing work to unify the rendering pipeline across all platforms. The result is that our targeted minimum version of OpenGL is very shortly going to be 4.1 (Core profile).

Consequently, Linux machines will need supported 3D hardware and a driver that supports OpenGL 4.1. Mesa software rendering will not be sufficient. If you have an Nvidia or AMD/ATI graphics card (and, I believe, Intel HD 4000 and higher) you should be able to find display drivers that will support OpenGL 4.1.
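If you want to check in advance whether your driver can hand out such a context, a small standalone test along these lines works (a sketch using Qt, which Interface builds on; it is not the actual Interface startup code):

#include <QGuiApplication>
#include <QOpenGLContext>
#include <QSurfaceFormat>
#include <cstdio>

int main(int argc, char** argv) {
    QGuiApplication app(argc, argv);

    // Ask for a 4.1 core profile context...
    QSurfaceFormat fmt;
    fmt.setVersion(4, 1);
    fmt.setProfile(QSurfaceFormat::CoreProfile);

    QOpenGLContext ctx;
    ctx.setFormat(fmt);
    if (!ctx.create()) {
        fprintf(stderr, "could not create any GL context\n");
        return 1;
    }

    // ...and see what the driver actually handed back.
    QSurfaceFormat got = ctx.format();
    printf("got GL %d.%d, core profile: %s\n",
           got.majorVersion(), got.minorVersion(),
           got.profile() == QSurfaceFormat::CoreProfile ? "yes" : "no");
    return 0;
}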

Similarly, some older MacBooks (see this chart) will not be able to run the application. Anything from the mid-2012 refresh onwards should probably be fine.


#6

I seriously don’t know what to say to that. You’re going to ask every Linux user to install a proprietary driver on their OS, just to make a cutting edge OpenGL version a requirement? Considering how experienced the Hifi developers are, I did not expect such a thing… this would be a great mistake if it happens.

The team is basically shutting out everyone using the normal drivers provided by their distribution. Many people who run modern games on Linux (such as myself) stick with Mesa, both because we don’t want a proprietary driver on our system and because that driver has more bugs and is less stable on Linux (I used it in the past and know this). Also consider people with older machines who could normally run High Fidelity well without this requirement.

Can the developers please look into this situation and do something for Linux users without the ATI / Nvidia drivers? Making OpenGL 4 a requirement at this point is absurd… if anything, shaders that use it should be disabled when the hardware doesn’t support them.


#7

Linux users doing 3D gaming and not using the proprietary driver? Ouch, that’s not smart. Without the proprietary driver you’re missing a lot of things with Nvidia, especially speed.

Always install the proprietary driver under Linux if you need speed.

The bigger problem is that there are no pre-compiled versions for Linux, with the result that I’d better stop looking at Linux because of the missing dependencies.


#8

I use an ATI card, not Nvidia. The performance and compatibility of the open-source driver have greatly improved over the last few years and are getting close to those of the proprietary driver (although still noticeably slower). I hear that Nvidia users are less lucky, and Nouveau doesn’t offer the same benefits as Nvidia’s own driver… at least AMD owners can safely use the default Linux libs for gaming, though.

The idea that free video drivers can’t run games is mostly a leftover stereotype. I uninstalled fglrx over a year ago and have been playing modern 3D games through Mesa ever since… including Second Life with all shaders and graphical settings at maximum (shadows, depth of field, etc.). I don’t believe we should be disregarded and forced to either use a driver we don’t want (and which is also buggy) or abandon High Fidelity altogether.


#9

OK, yes, Nouveau is something you want to drop ASAP with Nvidia. For example, with the open-source driver I cannot enable advanced lighting, ambient occlusion, etc. in Second Life.


#10

| You’re going to ask every Linux user to install a proprietary driver on their OS, just to make a cutting edge OpenGL version a requirement?

Not exactly.

OpenGL 4.1 is not exactly cutting edge, being just over 5 years old. We might have targeted something newer, but it’s the most recent version of OpenGL supported by OS X across a broad range of hardware.

Further, whether you choose to purchase hardware from a vendor that only provides closed-source binary drivers supporting modern OpenGL is entirely up to you. Intel’s Linux graphics drivers are completely open source, I believe.

While we do obviously support Open Source, making design decisions to hobble our rendering pipeline simply because there are insufficient open implementations of our desired target API might be taking it a bit far.


#11

Mesa isn’t an obscure library that I’m asking the developers to support just for me. It’s the default OpenGL renderer for every Linux distribution, and a good portion of those using Linux with ATI stick to it… sometimes because we prefer to, at other times because we have to (the proprietary driver doesn’t work properly). By ignoring Mesa support, I and many other users are being indirectly “banned” from the project unless we change our operating system or the drivers on our machines, despite having drivers capable of running any other modern 3D game. Sure, I agree that Mesa bears the biggest share of the blame for not supporting OpenGL 4 to this day… the question is what the solution should be.

I’m tempted to ask whether anyone from the High Fidelity team might be willing to work with the Mesa developers to get OGL 4.1 working soon. That may be a bit silly since they’re completely different projects, but maybe some collaboration is possible.

The best solution might be to have two rendering pipelines. I know of game engines that can work with OpenGL 1.x, 2.x and 3.x, because different shaders and features can be toggled independently, so you only need a given OpenGL version if you’re enabling a shader that uses it. Why not make each shader require only the OpenGL version of the extensions it’s using, and make each non-vital shader optional?
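(A generic sketch of that fallback idea, with hypothetical shader variants – this is not Interface code, and it assumes the GL function pointers are already loaded, e.g. via GLEW, with a current context: compile the high-end variant first and silently drop to a simpler one, or skip the effect, if the driver rejects it.)

#include <GL/glew.h>
#include <string>

// Try to compile 'source' for the given stage; return 0 if the driver rejects it.
static GLuint tryCompile(GLenum stage, const std::string& source) {
    GLuint shader = glCreateShader(stage);
    const char* text = source.c_str();
    glShaderSource(shader, 1, &text, nullptr);
    glCompileShader(shader);
    GLint compiled = 0;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
    if (!compiled) {
        glDeleteShader(shader);
        return 0;
    }
    return shader;
}

// Pick whichever variant of a non-vital effect the hardware can handle,
// or disable the effect entirely (both source strings are hypothetical).
GLuint loadOptionalEffect(const std::string& fancy430, const std::string& basic120) {
    if (GLuint s = tryCompile(GL_FRAGMENT_SHADER, fancy430)) {
        return s;   // full effect
    }
    if (GLuint s = tryCompile(GL_FRAGMENT_SHADER, basic120)) {
        return s;   // degraded but functional
    }
    return 0;       // effect disabled, the application still runs
}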


#12

I haven’t tried Mesa, I don’t think; I haven’t done anything involving graphics on Linux for years. I’ve had the most success with an R9 Radeon using fglrx-updates (the default was “xserver”). Rendering was mostly very good, maybe just the primary pass; shadows/effects did not work as expected.


#13

I spoke with the Mesa crew for a bit about this, in #dri-devel on irc.freenode.org. There’s apparently some good news and some bad news.

The good news is that Mesa 11.0 (currently GIT master) might officially support OpenGL 4.1 relatively soon. For Radeon 6xxx cards this might take longer, since some technical issues haven’t been solved yet.

The bad news is that High Fidelity also seems to require something called the OpenGL compatibility profile. Mesa only supports core profiles, and there are apparently no plans to support compatibility ones yet.


#14

No, we’re moving entirely to the core profile in the very near future. In fact we have to move to the core profile because Mac OpenGL support is in a similar position: if you want a modern OpenGL context there, you can only use the core profile. We’ve been going through the codebase over the past few months removing legacy OpenGL code and are hopefully in the final stretch of dumping compatibility mode.

You can see the remaining work going on in the Core branch of the github repository.


#15

Ah… that is good. I heard Mesa might never support compatibility profiles, so that’s one worry out of the way.

Otherwise, I was just amazed to discover that, after years of struggle, Mesa seemingly finalized OpenGL 4.0 support about… one week ago, which is also when High Fidelity introduced the 4.3 requirement. Of all the coincidences and perfect timing…

https://www.phoronix.com/scan.php?page=news_item&px=Mesa-Now-OpenGL-4.0

So from what I understand, I need to wait a few months until a Mesa 11.0 package is released for my distribution. Compiling Mesa from GIT has stopped working for me again, otherwise I would be able to test this right now. It’s also not certain whether the R600 driver works with it yet, as some suggested there’s still work to do for that series… we’ll see when I can get back to running Hifi, in that case.


#16

How about letting us have a crack in advance at finding and presenting compatibility options then?

Common paths around the pipeline changes definitely exist, so I think these needs could be addressed in parallel on both sides, with minimal disruption to your flow.

Another opportunity the 4.1 mandate sacrifices is the ability to run Interface from within a Linux VM. I’ve done this, and it’s easily the lightest-weight automated testing rig for your type of VR application – no code changes are required to drive the app, and functionality can be tested as if by a human.


#17

With this migration to the core profile, the last of the raw OpenGL code is being removed from the codebase, and all rendering will now be done through our GPU library. Currently we have a GL backend, which we’re in the process of migrating to work on the GL 4.1 core profile. However, the library is designed and intended to support multiple backends (like, say, Vulkan).

Anyone is free to tackle producing an alternative backend, be it OpenGL 3.3, DirectX 11 or Unity. The biggest impediment is that you’d have to rewrite all the shaders to a greater or lesser degree. Still, it probably wouldn’t be too hard to come up with something that could outperform our current backend in some respects. However, we’re still in the process of optimizing, and now that we’re moving fully to Core the optimization work should go significantly faster.
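(To illustrate the general shape of that design – purely a sketch, not the actual interface of the gpu library: the engine records its rendering work into an API-agnostic form and a backend replays it on a specific API, so a community backend could slot in behind the same abstraction.)

#include <memory>

struct Batch;   // hypothetical recorded stream of rendering commands

class Backend {
public:
    virtual ~Backend() = default;
    // Replay the recorded commands on a concrete graphics API.
    virtual void render(const Batch& batch) = 0;
};

class GL41Backend : public Backend {
public:
    void render(const Batch&) override { /* translate commands to GL 4.1 core calls */ }
};

class GL33Backend : public Backend {
public:
    void render(const Batch&) override { /* same commands, older GL, rewritten shaders */ }
};

// The rest of the engine only ever talks to the abstract Backend,
// so an alternative implementation can be swapped in here.
std::unique_ptr<Backend> makeBackend(bool haveGL41) {
    if (haveGL41) {
        return std::make_unique<GL41Backend>();
    }
    return std::make_unique<GL33Backend>();
}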

While I take your point about the ability to run in a VM, and would like to be able to apply some automated testing of the rendering engine, it’s not yet on our radar. As Malcolm Reynolds said: Can’t miss a place you’ve never been.


#18

I think a question that shouldn’t be missed is: which shaders require which OpenGL 4.x extensions exactly, and can those shaders be made optional? The OpenGL version is just a number that indicates whether the system supports a given set of GLSL extensions… so if it’s 3.3 it supports gl_foo, if it’s 4.0 you can also use gl_bar, and so on. This is why most engines gray out specific graphical effects if you have a video card or driver with an old GLSL version, but you can still play the game and use the shaders your card does support.
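(A generic sketch of that kind of gating, with hypothetical helper names – not Interface code; it assumes a current GL 3.0+ context so the integer version queries are available.)

#include <GL/glew.h>

// Only enable an effect if the context reports at least the given version,
// the same way engines gray out options in their graphics settings.
bool contextAtLeast(int wantMajor, int wantMinor) {
    GLint major = 0, minor = 0;
    glGetIntegerv(GL_MAJOR_VERSION, &major);   // GL 3.0+ query
    glGetIntegerv(GL_MINOR_VERSION, &minor);
    return major > wantMajor || (major == wantMajor && minor >= wantMinor);
}

// Usage (hypothetical feature flags):
//   bool enableTessellation = contextAtLeast(4, 0);
//   bool enableComputeBlur  = contextAtLeast(4, 3);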

So does High Fidelity require any OpenGL 4.1 extensions for rendering to work at all (i.e. not get a blank screen), or does it only require them for specific graphical effects (like, say, reflections)? Because if it’s the latter, the individual shaders that need them could simply be disabled. It’s better, after all, to run High Fidelity without tessellation / photorealistic lighting / other effects than not to be able to run it at all, as much as I love shaders and visual effects myself.


#19

OK, I think I understand where you are coming from – thanks for clarifying.

Since the GPU library doesn’t appear to be a git submodule or similar, any advice on how to actually swap out just the GPU backend? Currently the best option seems to be forking the whole repo (like you did way back when with the OculusSDK) and applying community changes on top of that.


#20

I’m still curious about my earlier question: why can’t shaders or shader functions that rely on OpenGL 4.1 directives be made optional, as most engines do to my knowledge?