Support for procedurally rendered spheres and boxes added


#1

Update:

There is now a PR for V2 of procedural entity support here. V1 shaders will continue to be supported, but some of the limitations of V1 are removed in V2. For full details see my post on V2 changes further down.

An example of userData for a V2 procedural surface can be found here and an example V2 shader can be found here.

Original:

A PR was merged this morning that added support for using shaders to procedurally render box and sphere entities. I’ve created a small tutorial video here

I’ll try to add some more documentation on what’s available for use in the shader. For now, the video should allow interested developers to get started with some procedural surfaces. I also have a file here that lists the currently available inputs and demonstrates a simple shader that flashes blue, similar to the video.
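A minimal sketch of a flashing-blue shader along those lines, assuming the V1 getProceduralColor entry point and the iGlobalTime input discussed later in the thread:

vec4 getProceduralColor() {
    // Pulse the blue channel over time; iGlobalTime is the elapsed time in seconds
    float pulse = 0.5 + 0.5 * sin(iGlobalTime * 2.0);
    return vec4(0.0, 0.0, pulse, 1.0);
}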


#2

This is really interesting stuff.
A few questions from a coding noob:

I had a look at the Shadertoy website. I’m guessing these pieces of code won’t work directly in Interface at the moment. Is the eventual goal that they will?

Maybe you could post up a few links to ones that do?

I got your hexagonal floor working by putting this bit of code in the cube entity’s user data box, with the shader file linked from my Dropbox:

{
    "ProceduralEntity": {
        "shaderUrl": "https://dl.dropboxusercontent.com/u/10483952/shaders/hex.fs"
    }
}

Maybe we could post examples of working shaders here for us all to mess around with.


#3

I’d like to get much closer than we are now. In fact I made some progress on that last night.

Take the following shader…

vec2 iResolution = iWorldScale.xz; 
vec2 iMouse = vec2(0);

////////////////////////////////////////////////////////////////////////////////////
// REPLACE BELOW
////////////////////////////////////////////////////////////////////////////////////

void mainImage( out vec4 fragColor, in vec2 fragCoord ) {
    fragColor = vec4(0, 0, 1, 1);
}

////////////////////////////////////////////////////////////////////////////////////
// REPLACE ABOVE
////////////////////////////////////////////////////////////////////////////////////


vec4 getProceduralColor() {
    vec2 position = _position.xz;
    position += 0.5;
    position.y = 1.0 - position.y;
    
    vec4 result;
    mainImage(result, position * iWorldScale.xz);
    return result;
}

If you replace the contents of the marked section with something from shadertoy it should result in that shader being rendered on the top (and bottom) surfaces of a box. Doing that is how I was able to get this image:

https://alphas-new.highfidelity.io/uploads/highfidelity/4426/5b471140db6c4660.jpg

and make this video:

There are a couple of caveats, though.

  • Shadertoys that use textures, video, or audio won’t work. They’ll complain about iChannel0 through iChannel3. There’s no support for this right now.
  • If a shader is using textures only for noise, then it’s usually possible to adapt the shader to use one of the noise functions we’re providing instead. It takes a bit of debugging, though.
  • The word smooth is a keyword in OpenGL 4.1 but isn’t in WebGL, so some shaders might use smooth as a variable or function name somewhere in their code. In this case you need to rename it to something else, as in the sketch below.
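A trivial illustration of that rename (hypothetical variable name):

// float smooth = 0.5;    // fails to compile: 'smooth' is a reserved keyword in desktop GLSL
float smoothness = 0.5;    // renamed; compiles fine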

#4

Hiya @Jherico ,

I took your shadertoy implementation example, documented it a bit, and made it more complete. iResolution is supposed to have 3 values; Z is for the ratio, normally 1.0. iMouse contains the clicked position in Z and W. I also added iGlobalTime as the W value of iDate, since this is used in quite a few toys.

Improved example:

// This line is here to make it easier to find the line in the debugger
#line 3
// Resolution X, Y, Z (Z is the pixel aspect ratio, 1.0 by default)
vec3 iResolution = vec3(iWorldScale.xz, 1.0);
// Mouse X,Y coordinates, and Z,W are for the click position if clicked (not supported in High Fidelity at the moment)
vec4 iMouse = vec4(0);
// We set the seconds component of iDate (iDate.w) to iGlobalTime, which contains the current time in seconds
vec4 iDate = vec4(0, 0, 0, iGlobalTime);

////////////////////////////////////////////////////////////////////////////////////
// REPLACE BELOW WITH SHADERTOY SCRIPT
////////////////////////////////////////////////////////////////////////////////////


// This example is available at: https://www.shadertoy.com/view/XtlGW4

void mainImage( out vec4 f, in vec2 p ){
    
    vec3 q=iResolution,d=vec3(p.xy-.5*q.xy,q.y)/q.y,c=vec3(0,.5,.7);
    
    q=d/(.1-d.y);
    float a=iDate.w, k=sin(.2*a), w = q.x *= q.x-=.05*k*k*k*q.z*q.z;

    f.xyz=d.y>.04?c:
    	sin(4.*q.z+40.*a)>0.?
        w>2.?c.xyx:w>1.2?d.zzz:c.yyy:
	    w>2.?c.xzx:w>1.2?c.yxx*2.:(w>.004?c:d).zzz;
    
}

////////////////////////////////////////////////////////////////////////////////////
// REPLACE ABOVE WITH SHADERTOY SCRIPT
////////////////////////////////////////////////////////////////////////////////////

// This is the function that is being called by us
vec4 getProceduralColor() {
    // retrieve the position to get the color
    vec2 position = _position.xz;
    // add a half to all the axes to adjust them to our method
    position += 0.5;
    // invert the y axis
    position.y = 1.0 - position.y;
    // initialize the result value
    vec4 result;
    // We call Shadertoy's entry point here, which is mainImage for normal viewports
    // This function writes to the result value; as input we pass the position multiplied by the current world scale
    mainImage(result, position * iWorldScale.xz);
    // Return the colour vector to our renderer in Interface
    return result;
}

Thanks and greetings,

Thoys


#5

Thanks for the information. However, if Z is supposed to be the (aspect) ratio I would expect it to be equal to iResolution.x / iResolution.y, but that doesn’t seem to be the case on shadertoy. Are you referring to some other ratio?


#6

I’ve submitted a PR for V2 of the procedural support here. If you want to try it out before it’s merged feel free to grab the PR build.

It contains the following changes:


A version field in the procedural userData.

A missing version field will default to ‘V1’ behavior, so there should be no change required to existing shaders. However, keep in mind that once the API stabilizes we may want to deprecate old versions to reduce the amount of code maintenance required.


Initial support for more Shadertoy inputs

I’ve added iResolution, which is initialized as iWorldScale.xzy, as well as iMouse and iDate, initialized in the same way as Thoys has shown. Eventually iMouse and iDate will be initialized the same way Shadertoy does, but I’m adding them now so that when they are properly set I can do a V2.1 release that won’t break anyone’s shaders.


Removing the need for the #line declaration

The enclosing shader now includes a #line 1001 directive immediately before the spot where it incorporates user shaders. This removes the need for the #line 2 declaration, and it also makes it easy to tell in the log whether a compile error is in your shader or in the wrapper: if you see compile errors with a line number of 1001 or greater, just subtract 1000 to get the proper line number in your shader. You can still use a #line directive if you prefer, as it will override the one in the wrapper.


Support for lighting of procedural surfaces

In V2 the required function looks like this:

float getProceduralColors(inout vec3 diffuse, inout vec3 specular, inout float shininess)

To get the existing behavior, you would set the specular value to the RGB portion of whatever you were previously returning in V1, and then return 1.0 from the function.

To get a lit surface, you would instead set the diffuse color and return 0.0.

Here’s an example that shows both behaviors.

// Produce a lit procedural surface
float getProceduralColorsLit(inout vec3 diffuse, inout vec3 specular, inout float shininess) {
    vec3 noiseColor = getNoiseColor();
    diffuse = noiseColor;
    return 0.0;
}

// Produce an unlit procedural surface:  emulates old behavior
float getProceduralColorsUnlit(inout vec3 diffuse, inout vec3 specular, inout float shininess) {
    vec3 noiseColor = getNoiseColor();
    diffuse = vec3(1.0);
    specular = noiseColor;
    return 1.0;
}

float getProceduralColors(inout vec3 diffuse, inout vec3 specular, inout float shininess) {
    return getProceduralColorsLit(diffuse, specular, shininess);
}

Note that you don’t have to set diffuse to white (vec3(1.0)), but if you don’t, the color produced will be combined with the entity color. So if you have a shader that produces the color ‘yellow’ but your entity is ‘red’, then your result will be red, because the green color channel will be zeroed out when the diffuse and specular colors are combined in the lighting pass.

However, this provides one way of easily customizing the rendered results without having to edit the shader. If you create a shader that outputs a greyscale noise pattern, you can change what color the rendered result is by just changing the entity color.
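For instance, a sketch of such a greyscale shader, assuming the provided snoise noise function and the V2 entry point described above:

// A lit greyscale noise surface; the lighting pass tints the result with the
// entity's color property, so one shader can serve many differently colored entities
float getProceduralColors(inout vec3 diffuse, inout vec3 specular, inout float shininess) {
    float n = 0.5 + 0.5 * snoise(vec4(_position.xyz * 10.0, iGlobalTime));
    diffuse = vec3(n);
    return 0.0;
}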

For other ways of customizing the look of a procedural entity without having to edit the shader, read on…


Easier customizability on a per-entity basis

I’m adding support for named uniforms inside the user data. This means you’ll be able to write a single shader that you can customize in various ways for different entities.

For instance, suppose in your shader you had this

void someFunction() {
    ...
    float noise = snoise(vec4(_position.xyz, iGlobalTime));
    ...

}

But you want to change the animation speed for one entity without having to change the shader and re-upload it. With V2 you could do the following

uniform float iSpeed = 1.0;
void someFunction() {
    ...
    float noise = snoise(vec4(_position.xyz, iGlobalTime * iSpeed));
    ...
}

And in your entity’s user data you would put

{
    "ProceduralEntity": {
        "shaderUrl": "file:///C:/Users/bdavis/Git/hifi/examples/shaders/exampleV2.fs",
        "version": 2,
        "uniforms": {
            "iSpeed": 2.0
        }
    }
}

Now, that entity will have noise values that change twice as fast as other entities without that iSpeed declaration in the user data.

Uniforms specified this way can be either a number or an array of numbers, but the size of the array must match the size of the uniform. In other words, a number or an array of size 1 must correspond to a uniform declared as a float, while an array of size 2 must correspond to a uniform declared as a vec2.
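For example, pairing a hypothetical vec2 uniform with its user data entry:

// In the shader:
uniform vec2 iOffset = vec2(0.0, 0.0);

// In the entity's user data, an array of size 2 matches the vec2:
// "uniforms": { "iOffset": [0.25, 0.5] }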

Also bear in mind that you need to provide a default value for the uniform, or it will either default to 0 or be undefined (I can’t remember at the moment and can’t be bothered to look it up in the spec; just don’t leave it uninitialized).


Backlog of features I’d like to add

These are features I’d like to add, in a kind of vague priority order. However, bear in mind this is something of a pet project, so how much of this I get done depends on the amount of motivation I have in my free time, as well as how much free time I actually get.

  • Procedural shaders in skyboxes
      • A wrapper example for using VR-compatible shadertoys as skyboxes
  • Better compatibility with Shadertoy inputs
      • Add iResolution as a standard input and set it equal to iWorldScale.xzy for now
      • Add iMouse as a standard input, but without the click position for now
      • Add iDate as a standard input with values equivalent to what Shadertoy provides
  • Textures
      • Standard Shadertoy 2D textures (their video and audio texture support is probably further off though)
      • Arbitrary user-specified textures
  • Multiple shader URLs, so that you can keep a library of commonly used functions in a central location and use it from various shaders
  • An in-app editor for shaders and manipulation of shader data on an entity
  • Support for more procedurally rendered primitive shapes
      • regular polyhedra like octahedrons, dodecahedrons, etc.
      • curved surfaces like cylinders, tori, cones
      • Constructive Solid Geometry (this is pretty long term though, cause… hard)
  • Feedback on shader GPU cost
      • It’s pretty easy to create a shader that will cause the GPU to thrash and kill your frame rate, so having some kind of feedback on how much a shader is costing would be useful
  • Dynamic shader scaling
      • Ideally, you could set a specific budget for a shader and have it scale the virtual resolution of the surface to reduce the cost until you fell under the budget. However, this would be a pretty big change in the way entities are rendered, so this is another very long term, if ever, goal.

#7

All of this sounds very good. The future skybox support sounds like it will be a good thing. As does the lighting support.

On my big computer I don’t notice any performance issues at all. On my old laptop it might be different. I will try it out soon.

One thing I’d like to see is inverted entity rendering… in other words, procedural rendering on the inside faces of entities. This would be similar to skyboxes, but the difference is that skyboxes render beyond every object, whereas this would render on the inside faces of the entity.

This may not really be an issue. Maybe there is a simple workaround.

Oh… and while I am in wish-list mode… how about Polyvox faces? Oh, and an alpha transparency channel would be nice too.

And thanks for a very nice and fun addition to HiFi.


#8

Hiya @Jherico,

On the ratio, I just followed their help file; for iResolution it says that z is the pixel aspect ratio:

Shadertoy specific inputs

vec3       iResolution            image        The viewport resolution (z is pixel aspect ratio, usually 1.0)
float      iGlobalTime            image/sound  Current time in seconds
float      iChannelTime[4]        image        Time for channel (if video or sound), in seconds
vec3       iChannelResolution[4]  image/sound  Input texture resolution for each channel
vec4       iMouse                 image        xy = current pixel coords (if LMB is down), zw = click pixel
sampler2D  iChannel{i}            image/sound  Sampler for input textures i
vec4       iDate                  image/sound  Year, month, day, time in seconds in .xyzw
float      iSampleRate            image/sound  The sound sample rate (typically 44100)

#9

I went over to the dreaming domain to find this amazing setup!


#10

Support for procedurally rendered skyboxes has been added. However, skyboxes work a little differently from entities. A skybox can’t be lit, and it has only one varying input: _normal. This _normal value is what we would otherwise use to do a cubemap lookup, so you can treat it as the direction of the pixel from the viewer.

The function definition for a skybox shader is vec3 getSkyboxColor(). You can see an example of one here.
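A minimal sketch of a skybox shader along those lines (the gradient and colors here are placeholders, not the linked example):

vec3 getSkyboxColor() {
    // _normal is the direction of the pixel from the viewer, as described above
    vec3 dir = normalize(_normal);
    // Blend from a horizon color up to a zenith color
    vec3 horizon = vec3(0.8, 0.9, 1.0);
    vec3 zenith = vec3(0.1, 0.3, 0.8);
    return mix(horizon, zenith, clamp(dir.y, 0.0, 1.0));
}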

Additionally, I wanted to mention that while we still don’t support alpha values in the entity shaders, you can discard a pixel. Using the discard keyword in a shader simply means “don’t render anything at this pixel”. Combined with calculations inside the pixel shader as to where the pixel is relative to the shape, this allows you to do things like this

https://alphas-new.highfidelity.io/uploads/highfidelity/4450/1afbf9e73f2140ee.jpg

Both the clock shader and the floor shader are using the discard keyword to modify the shape from a square to something more interesting (circle and hexagon).
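A minimal sketch of the circle case, assuming the V1 getProceduralColor entry point and the _position varying (which runs from -0.5 to 0.5 on each axis):

vec4 getProceduralColor() {
    // Discard everything outside a disc inscribed in the top face of the box
    if (length(_position.xz) > 0.5) {
        discard;
    }
    return vec4(1.0);
}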

Bear in mind that discarding pixels only prevents them from being rendered. Physics calculations will still treat the entity as a box for the purposes of collisions and other physics-based effects.


#11

Judicious use of discard, nested spheres and a noise function


#12

Here is a script to create a shader ball as shown in the previous post.

QuorraBall = function() {};
QuorraBall.prototype.SIZE = 0.5;
QuorraBall.prototype.SHELLS = 14;
QuorraBall.prototype.INTENSITY_INCREMENT = 1.0 / (QuorraBall.prototype.SHELLS + 1);
QuorraBall.prototype.SIZE_INCREMENT = QuorraBall.prototype.SIZE * QuorraBall.prototype.INTENSITY_INCREMENT;
QuorraBall.prototype.NAME = "QuorraBall";
QuorraBall.prototype.POSITION = { x: 0, y: 0.5, z: -2.5 }; 
QuorraBall.prototype.COLOR = { red: 220, green: 220, blue: 220 };
QuorraBall.prototype.USER_DATA = { ProceduralEntity: {
        version: 2,
        shaderUrl: "https://s3.amazonaws.com/Oculus/shadertoys/quora2.fs",
        uniforms: { iSpeed: 1.0, iShell: 1.0 }
} };

// Clear any previous entities within 50 meters
QuorraBall.prototype.clear = function() {
    var ids = Entities.findEntities(MyAvatar.position, 50);
    var that = this;
    ids.forEach(function(id) {
        var properties = Entities.getEntityProperties(id);
        if (properties.name == that.NAME) {
            Entities.deleteEntity(id);
        }
    }, this);
}

QuorraBall.prototype.createBall = function(i) {
    var that = this;
    var intensity = 1.0 / (this.SHELLS + 1);
    var increment = this.SIZE * intensity;
    var size = that.SIZE - i * increment;
    var userData = JSON.parse(JSON.stringify(that.USER_DATA));
    userData.ProceduralEntity.uniforms.iShell = 1.0 - i * intensity;
    var currentSize = 0.05;
    var id = Entities.addEntity({
        type: "Sphere",
        position: Vec3.sum(MyAvatar.position, Vec3.multiplyQbyV(MyAvatar.orientation, that.POSITION)),
        name: that.NAME,
        color: that.COLOR,
        ignoreCollisions: true,
        collisionsWillMove: false,
        dimensions: { x: currentSize, y: currentSize, z: currentSize },
        userData: JSON.stringify(userData)
    });
    var updateSize = function(){
        var difference = size - currentSize;
        var newSize = size;
        if (difference > 0.005) {
            newSize = currentSize + difference * 0.1;
        }
        currentSize = newSize;
        Entities.editEntity(id, { dimensions: { x: currentSize, y: currentSize, z: currentSize } });
        if (difference > 0.005) {
            Script.setTimeout(updateSize, 10);
        }
    };
    Script.setTimeout(updateSize, i * 500);
}

QuorraBall.prototype.create = function() {
    for (var i = this.SHELLS - 1; i >= 0; --i) {
        this.createBall(i);
    }
}

var quorraBall = new QuorraBall();
quorraBall.clear();
quorraBall.create();

The shader referenced by the S3 URL looks like this:

const vec3 COLOR = vec3(24.0, 202.0, 230.0) / 255.0;

uniform float iWidth = 0.004;
uniform float iMiddle = 0.5;
uniform float iShell = 1.0;
uniform float iSpeed = 1.0;

vec4 getProceduralColor() {
    float intensity = 0.0;
    float time = iGlobalTime / 5.0 * iSpeed;
    vec3 position = _position.xyz * 1.5 * iWorldScale;
    for (int i = 0; i < 3; ++i) {
        float modifier = pow(2.0, float(i));
        vec3 noisePosition = position * modifier;
        float noise = snoise(vec4(noisePosition, time));
        noise /= modifier;
        intensity += noise;
    }
    intensity += 1.0;
    intensity /= 2.0;
    if (intensity > iMiddle + iWidth || intensity < iMiddle - iWidth) {
        discard;
    }
    return vec4(COLOR * iShell, 1);
}

float getProceduralColors(inout vec3 diffuse, inout vec3 specular, inout float shininess) {
    specular = getProceduralColor().rgb;
    return 1.0;
}

#13

I’m only six minutes into this video and I had to pause just to say, “thanks”. This is going to be AMAZING!


special note


For individuals using white space in their file paths: YOU WILL GET AN ERROR. [WARNING] [09/11 19:41:22] Invalid shader URL

This could be cleaned up in future releases, but for now, if you save shaders in a sub-directory like
"…/High Fidelity/shaders/…fs", Interface will not find them. Remove the space, or perhaps use URL encoding (%20).

This proc shader is the bee’s knees :slight_smile:


#14

First Skybox Shader I’ve completed. This is for those folks that would like a simple blue sky, and a simple blue ocean beneath them.

float time = iGlobalTime;

vec3 getSkyboxColor() {
    vec3 rayDir = normalize(_normal);
    float specialSauce = 0.0;
    float dotProd = dot(rayDir, vec3(0.0, 1.0, 0.0));
    specialSauce = pow(.5 - abs(dotProd), 2.0)
                 + step(0.0, rayDir.y) * .3
                 + ((sin(rayDir.y) + cos(rayDir.x)) * .002);
    vec3 colorTransform = vec3(.47, .37, .05);
    vec3 finalAnswer = pow(vec3(specialSauce), colorTransform);
    return finalAnswer;
}

This is a lot more fun than I expected it to be. :smiley:


New marquee shader. Certainly nothing to write home about (as the expression goes), but I intend to continue practicing with ShaderToy and working this into HiFi. I would like to integrate different shaders into an easy-to-use interface, as mentioned to @chris during the last meetup.

[code]
float disk(vec2 r, vec2 center, float radius) {
    return 1.0 - smoothstep(radius - 0.005, radius + 0.005, length(r - center));
}

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
    vec2 p = vec2(fragCoord.xy / iResolution.xy);
    vec2 r = 2.0 * vec2(fragCoord.xy - 0.5 * iResolution.xy) / iResolution.y;
    float xMax = iResolution.x / iResolution.y;

    vec3 col1 = vec3(0.216, 0.471, 0.698); // blue
    vec3 col2 = vec3(1.00, 0.329, 0.298); // red
    vec3 col3 = vec3(0.867, 0.910, 0.247); // yellow

    // theSpaceBetween creates a gap; 3.0 will be every other slot
    float theSpaceBetween = 3.0;

    vec3 ret;

    vec2 q = r + vec2(xMax * 4. / 5., 0.);
    ret = vec3(0.2);
    // y coordinate depends on time
    float y = iGlobalTime;
    // mod constrains y to be between 0.0 and 2.0,
    // and y jumps from 2.0 to 0.0;
    // subtracting 1.0 makes y jump from 1.0 to -1.0
    y = mod(y, 2.0) - 1.0;

    // speed modifier added for those that want control...
    float speedControl = 4.0;

    for (float i = -2.0; i < 2.0; i += 0.1) {
        ret = mix(ret, col3, disk(q, vec2((theSpaceBetween * i + i) + (y * 1.3 * speedControl), 0.5), 0.1));
    }

    vec3 pixel = ret;
    fragColor = vec4(pixel, 1.0);
}
[/code]

This will create a set of simple yellow circles moving across the frame:

Again, nothing to get SUPER excited about, but I will continue to focus on this aspect as it is enjoyable for me. Do we currently have the ability to add a shader to a normal .fbx entity? In practice, I would likely create regular marquee “bulbs”, and a single shader could be applied to them all. The shader would take care of driving the illumination based on entity location.

There are MANY interesting things we can do with this level of shader support!
PlasmaShader from ShaderFrog + ParticleEmitter from HiFi = RayGun :smile:

I’ve just uploaded a new shader to http://metaversecafe.com/HighFidelity/QueenCity/shaders/eyeball_green.fs

This eyeball shader is from a tutorial by IQ; I have learned a lot from it. Unfortunately, you cannot seem to use noise textures, so I borrowed a hash function to achieve the desired effect. Here is the code, annotated with comments from what I learned during the tutorial. Please note that converting these shaders from pure GLSL into HiFi can be tricky.

vec3 iResolution = vec3(iWorldScale.xz, 1.0);
vec3 col=vec3(1.0);

float hash( float n ) { return fract(sin(n)*753.5453123); }
float noise( in vec2 x )
{
    vec2 p = floor(x);
    vec2 f = fract(x);
    f = f*f*(3.0-2.0*f);
	
    float n = p.x + p.y*157.0;
    return mix(mix(mix(hash(n +   0.0), hash(n +   1.0), f.x),
                   mix(hash(n + 157.0), hash(n + 158.0), f.x), f.x),
               mix(mix(hash(n + 113.0), hash(n + 114.0), f.x),
                   mix(hash(n + 270.0), hash(n + 271.0), f.x), f.y),
               mix(mix(hash(n + 337.0), hash(n + 339.0), f.y),
                   mix(hash(n + 559.0), hash(n + 559.0), f.y), f.x));
}


mat2 m = mat2( 0.8, 0.6, -0.6, 0.8);
float fbm( vec2 p )
{
    float f = 0.0;
    f += 0.5000*noise( p ); 
    p*=m*2.02;
    f += 0.2500*noise( p );
    p*=m*2.03;
    f += 0.1250*noise( p );
    p*=m*2.01;
    f+= 0.0625*noise( p );
    p*=m*2.04;
    f /= 0.9375;    
    return f;
}

void mainImage( out vec4 fragColor, in vec2 fragCoord )
{
	vec2 q = fragCoord.xy / iResolution.xy;
    vec2 p = -1.0 + 2.0*q;
    //aspect ratio fix... divide the width by the height (landscape mode)
 //   p.x *= iResolution.x/iResolution.y;  
    
    
    //draw a circle with polar coordinates
    float r = sqrt( dot(p,p)/2.6 );
    float a = atan(p.y,p.x);
    
    //animation
    float ss = 0.5 + 0.5*sin(8.0*iGlobalTime);
    float anim =  1.0 + 0.1*ss*clamp(1.0-r,0.0,1.0);
	r *= anim;
    
    
    if( r<0.9 )
    {
        //set the color of the eye
     	col = vec3( 0.2, 0.3, 0.4) ;
        
        //add variation
        float f = fbm( 1.173*p );
        
        //mix the color into the variation
        col = mix( col, vec3(0.2,0.5,0.4), f );
		
        //add a new ring of color (this is a variation of the base color, so use f)
        f = 1.0 - smoothstep(0.2, 0.5, r);
        col = mix( col, vec3(0.9, 0.6, 0.2), f);
        
        //apply some domain distortion (helps make things look organic and natural)
        //adjust the amplitude and frequency to suit.
        a += 0.1*fbm( 20.0 * p );        
        
        //add some white into it... (use polar coordinates so radiates from center)
        //add some contrast by smoothstepping the fbm
        f = smoothstep( 0.3, 1.0, fbm( vec2(5.0*r,20.0*a) ) );
        col = mix(col, vec3(1.0), f);
        
        //add different colors by copying the white addition...
        //change the frequency of the fbm so they do not completely overlap
        f = smoothstep( 0.4, 0.9, fbm( vec2(10.0*r,15.0*a) ) );
        //modulate the color by multiplying by f
        col *= 1.0 - 0.5*f; 
        
        //add a volume effect to the eyeball
        f = smoothstep( 0.5, 0.8, r);
        col *= 1.0 - 0.5*f;
        
		//add the pupil on top of the previously added iris
        f = smoothstep ( 0.2, 0.23, r);
		col *= f;
       
        //anti-alias the edges
        f = smoothstep( 0.75, 0.8, r);
        col = mix( col, vec3(1.0), f );
    }   
    
	fragColor = vec4(col,1.0);
}

vec4 getProceduralColor() {
    vec2 position = _position.xz;
    position += 0.5;
    position.y = 1.0 - position.y;
    vec4 result;
    mainImage(result, position * iWorldScale.xz);
    return result;
}

Also desired is the opportunity to apply these shaders to custom .fbx and .obj entities!


#15

Completely lost. I tried the fire from the marketplace, but nothing seems to happen, and I also cannot see where to set the coordinates.

Looked at the sandbox model and almost got it, until I figured out I cannot paste info into the user data field because of the other bug.


#16

@chris The fire “shader” I had propagated to Marketplace was essentially just an .fs that gets translated into a .js. IT DOES NOT GIVE YOU ANYTHING AND SHOULD BE REMOVED…

What I learned:
You must take a shader and dynamically “place” it onto an entity using a .js script.
For an example of this, run one of the following from URL in your Running Scripts:
http://metaversecafes.com/HighFidelity/QueenCity/shaders/proto_foreverflow_box_v2.js

and

http://metaversecafes.com/HighFidelity/QueenCity/shaders/proto_foreverflow_v2_sphere.js

((clicking these links will force a 403, that is intentional - they function in HiFi))

If you want the flame shader, you will find “flame_v7.fs” a shader file, in the same directory.

HAVE FUN!


#17

@AlphaVersionD did you try attaching your shader script to an entity before uploading it to the marketplace?


#18

Yes, and that part works fine, or at least as expected. So for my future reference as well as others’: when uploading a shader file to MP, do we upload the core .fs, or should we upload a .js that will add it to an entity and spawn it on run? I have added a .js to MP (pending approval) for the above two .js files.

update

I just realized your question could in actuality be about a bug in MP. To revisit your question: yes, I tested the version 7 flame shader on a box entity prior to uploading it via the MP uploader. So it was on an entity and was functional, and then I performed the upload. My guess is MP should take that file and add it to an entity when the user clicks “Get”, but perhaps this functionality is not yet built into MP.

Cheers!


#19

Hey @Jherico! Did V2 ever get pulled into the public build? I’m trying to integrate the following userData and it doesn’t appear to register on my entity.

    {
        "ProceduralEntity": {
            "shaderUrl": "file:///C:/[location]/control_burn_v2.fs",
            "version": 2,
            "uniforms": {
                // Flame hold-together-ness :) valid values run from 0.1 to 5.0
                "ud_viscosity": 1.8,
                // Strength
                "ud_thrust": 3.0,
                // Color of flame; this only has four options at this time (FLOAT)
                // 1.0 = default red and orange flames
                // 2.0 = blue flame
                // 3.0 = green flame
                // 4.0 = purple flame
                "ud_color": 1.0,
                // Discards a pixel from the render (kind of like alpha, but not quite);
                // think of this as a threshold measure of a pixel vector, valid 1 - 10.
                // 10.0 will discard 100% of pixels
                "ud_disgardFactor": 1.0
            }
        }
    }

and if integrated would do this:

I didn’t see anything in the release notes over the last 20-something versions and was wondering when this functionality would be made available. Thanks!

[[bump]]

@chris do you know if or when V2 of shader integration is coming?


#20

V2 of the shaders and texture support are both in the production code already. If your shader effect isn’t doing what you expect, I would suggest tailing the log file and seeing if it’s reporting a compilation error.

Here is an example of a V2 shader that uses textures to display a kind of poster that rotates between 4 different images…

{
    "ProceduralEntity":{
        "version":2,
        "shaderUrl":"https://s3.amazonaws.com/DreamingContent/shaders/image.fs",
        "uniforms":{"iSpeed":1,"iShell":1},
        "channels":[
            "https://s3.amazonaws.com/DreamingContent/images/12th.jpg",
            "https://s3.amazonaws.com/DreamingContent/images/11th.jpg",
            "https://s3.amazonaws.com/DreamingContent/images/10th.jpg",
            "https://s3.amazonaws.com/DreamingContent/images/timey-wimey.jpg"
        ]
    }
}

If you look at the shader source itself you’ll see that the important function looks like this:

float getProceduralColors(inout vec3 diffuse, inout vec3 specular, inout float shininess) {
    if (_position.z > -0.49) {
        discard;
    }

    float t = mod(iGlobalTime / 5.0, 4.0);
    float f = fract(t);
    vec3 color = indexedTexture(t);
    if (f > 0.9) {
        t = mod(t + 1.0, 4.0);
        vec3 color2 = indexedTexture(t);
        color = mix(color, color2, smoothstep(0.9, 1.0, f));
    }
    specular = color;
    return 1.0;
}
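The indexedTexture helper isn’t shown in the post. As a guess at its shape (the sampler names and UV mapping here are assumptions, not confirmed by the thread), it might look something like:

vec3 indexedTexture(float t) {
    // Pick one of the four channel textures based on the integer part of t
    int index = int(floor(t));
    // Map the face position (assumed to run from -0.5 to 0.5) into texture coordinates
    vec2 uv = _position.xy + 0.5;
    if (index == 0) { return texture(iChannel0, uv).rgb; }
    if (index == 1) { return texture(iChannel1, uv).rgb; }
    if (index == 2) { return texture(iChannel2, uv).rgb; }
    return texture(iChannel3, uv).rgb;
}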

I’m setting the specular value here to my color and returning 1.0 to indicate that I want this to be an emissive surface, i.e. no lighting effects. Essentially this makes it look like a kind of in-world TV.

If I returned 0 instead and set the diffuse instead of the specular color then it would end up being lit based on scene lighting and look more like a painting.
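A sketch of that lit variant, reusing the image selection from the function above and only changing which output is written:

float getProceduralColors(inout vec3 diffuse, inout vec3 specular, inout float shininess) {
    if (_position.z > -0.49) {
        discard;
    }
    float t = mod(iGlobalTime / 5.0, 4.0);
    vec3 color = indexedTexture(t);
    // Written to diffuse with a 0.0 return, so the surface is shaded by scene lighting
    diffuse = color;
    return 0.0;
}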

Work in progress and on the backlog…

Skyboxes

I’d like to do some back-end work on the procedural skyboxes. Right now it’s really easy to get a shader that will kill the frame rate if it’s doing serious calculation. This is particularly bad for VR. On my roadmap I have a task to put the rendering of the skybox into a background thread, rendering to a cubemap rather than directly to the screen. This will increase the overall amount of work the skybox renderer has to perform, since the cubemap might be many more pixels than the actual skybox portion of the screen, but will allow us to render arbitrarily expensive shaders for the skybox without impacting the frame rate. We’ll also be able to exert control over the size of the cubemap textures so that a more expensive skybox can be rendered at a lower resolution in order to limit the overall GPU cost.

Content development

I’m also working on a mechanism right now to allow much easier development of shaders and scripts without having to change their URLs to local file URLs. The idea is that you’ll be able to load a script that will call a function looking something like this:

Resources.setUrlPrefixOverride(
    "https://s3.amazonaws.com/DreamingContent/", 
    "file:///C:/Users/bdavis/Git/dreaming/");

The idea here is that with the prefix set, when content gets loaded into the scene, any URL beginning with the first prefix will have it replaced with the second prefix. This means that the person running the script can be looking at local content, while everyone else who visits the domain sees the server content. You can make changes to the local files (and, where supported, take advantage of the fact that Interface will automatically reload changed files when you save them) until you’re satisfied with your work, then push the local content to the server. I use AWS S3 and a command line client, so I can publish all my local work like this: aws s3 sync C:/Users/bdavis/Git/dreaming s3://DreamingContent

This is a lot less tedious than editing an entity to point to a local file (and breaking it for anyone else who visits your domain) and then editing it back after you’ve made changes and uploaded them. I’ll do a full top-level post in announcements once this work is done and in production.

