New WebGL Shader System (Kiwi.js v1.1.0)

Hello, and welcome to the seventh – and last – in a series of in-depth articles about Kiwi.js v1.1.0, now available for use at www.kiwijs.org. Today we’re talking about WebGL shaders and how they can do amazing things. I’m Ben Richards, and I’ve put a lot of effort into bringing powerful shaders to Kiwi.js.

This is a complex topic. It’s going to be quite long, and involve a fair amount of code. Before you begin, you should be familiar with the topics covered in the previous articles in this series.

You should also be familiar with the procedure for creating a Kiwi.js plugin. You will need to do a fair amount of research on your own to fully understand the topic, but this should be enough to get you started.

Who Needs to Read This Post?

Advanced users. If you want to create WebGL shaders, this is where it all comes together. The article is targeted to Kiwi.js users, and depends to some extent on the Kiwi.js infrastructure. However, the principles may also be useful for other projects. If you just want to use advanced shaders that somebody else has coded, things are much simpler – they should come with instructions.

Elements of a Shader

In general terms, a WebGL shader comes in two parts: a vertex shader and a fragment shader. Together these make a program; in Kiwi.js, we store them as a shader pair. Information is passed into shaders as uniforms.

Vertex Shader

This code performs operations on the vertices of an object. These are the corners of objects; in Kiwi.js, this is usually the corners of a square. Vertex shaders are generally used to control position, movement, and generic lighting.

Fragment Shader

This code performs operations on fragments. A fragment is essentially a single pixel. It inherits information from the vertex shader. Fragments can control colour, texture, advanced lighting, and more. Most advanced trickery occurs in fragment shaders.

Shader Pair – Program

The graphics card renders using a vertex shader and a fragment shader. These are compiled and used together. In Kiwi.js, we store the program in a shader pair. The basic shader pair is Kiwi.Shaders.ShaderPair, which provides certain key functions used under the hood. It also stores the vertex and fragment shaders.

Uniforms

A uniform is a variable sent to a shader program. The vertex and fragment shaders each have their own uniforms. A uniform can be one of several types, determined within the shader. It is called “uniform” because it does not change while a frame renders; it has the same value for every vertex and pixel being drawn. Once the frame is done, you can change it for the next one.
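
For example, a fragment shader might declare and use a uniform like this (a minimal GLSL illustration, not tied to any particular Kiwi.js shader):

    precision mediump float;

    // A float uniform, set from game code via the WebGL API
    uniform float uAlpha;

    void main(void) {
        // Every pixel drawn this frame sees the same uAlpha value
        gl_FragColor = vec4( 1.0, 0.0, 0.0, uAlpha );
    }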

Creating a Shader Plugin

Today we’re going to create a shader. There are infinitely many possibilities available. I’ve decided to create a “ripple shader”, a system that distorts the texture being drawn as though it were reflected in water.

Groundwork

Before we can even begin to work on our shader, we need to be able to see it. We must apply the shader to an object. This requires a little bit of work. We’re going to need a custom renderer, a multi-texture atlas, and a couple of stages of setup.

Custom Renderer

Shader pairs live on Renderer objects. You need a unique Renderer to separate your effects from the rest of the scene. In most cases, you’ll also need a renderer to set unique uniforms on the shader pair.

While it is possible to extend Kiwi.Renderers.Renderer (the basic renderer prototype), the great majority of shaders will be designed to work with textured quadrilateral objects – Kiwi.js assets. For this reason, it’s usually better to extend Kiwi.Renderers.TextureAtlasRenderer, the default renderer, which is designed to do most of the heavy lifting for you.

Most of the TextureAtlasRenderer code is already fit for purpose; the only place where unique uniforms are discussed is in TextureAtlasRenderer.enable(). This should be the only piece of code you need to alter or extend.

As part of your plugin, you should create an extended renderer. This should be in the Kiwi.Renderers namespace so that it can be recognised by the engine.
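
Here is a minimal sketch of such an extension. The name RippleShaderRenderer is our choice, and the constructor and enable() signatures are assumed to follow the standard Kiwi.js renderer pattern; check the TextureAtlasRenderer source for the exact details.

    // Sketch: a custom renderer extending the default TextureAtlasRenderer
    Kiwi.Renderers.RippleShaderRenderer = function( gl, shaderManager, params ) {
        Kiwi.Renderers.TextureAtlasRenderer.call( this, gl, shaderManager, params );
    };
    Kiwi.Renderers.RippleShaderRenderer.prototype =
        Object.create( Kiwi.Renderers.TextureAtlasRenderer.prototype );

    // The ID under which the render manager can find this renderer
    Kiwi.Renderers.RippleShaderRenderer.RENDERER_ID = "RippleShaderRenderer";

    Kiwi.Renderers.RippleShaderRenderer.prototype.enable = function( gl, params ) {
        // Run the default setup first
        Kiwi.Renderers.TextureAtlasRenderer.prototype.enable.call( this, gl, params );

        // Unique uniforms will be set here, once we know what they are
    };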

We will revisit the custom code later, once we understand what kind of uniforms we need.

Multi-Texture Atlas

Most shaders will require a multi-texture atlas, as one image simply can’t contain enough data to fully describe a complex situation. We’ve expanded the texture pipeline to support a multi-texture plugin. For this example we’ll include multi-texture functionality with the shader plugin itself.

We hope to make a formal multi-texture plugin available very soon. Strictly speaking, bundling such a fundamental tool into the shader plugin is poor practice: it should be available independently, no matter how many shaders you are using. Try not to do things this way in your own projects – keeping the two separate will save you time and worry in the long term.

Anyway, here’s a multi-texture atlas:
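
What follows is a condensed sketch of the idea rather than the full plugin listing: an atlas that accepts an array of source images instead of a single one. The real version also has to bind each image to its own texture unit when the atlas is enabled; see the downloadable source at the end of this article for the complete code.

    // Sketch: a texture atlas managing several source images at once
    Kiwi.Textures.MultiTextureAtlas = function( name, type, cells, images, sequences ) {
        // Satisfy the parent constructor with the first image
        Kiwi.Textures.TextureAtlas.call( this, name, type, cells, images[ 0 ], sequences );

        // Store the full list; each image will occupy its own texture unit,
        // in array order: images[0] -> unit 0, images[1] -> unit 1, etc.
        this.images = images;
    };
    Kiwi.Textures.MultiTextureAtlas.prototype =
        Object.create( Kiwi.Textures.TextureAtlas.prototype );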

Game Implementation

In addition, we need to do a few more things to use these structures. This code does not belong in the plugin; it must be part of your game code itself.

First, we must have images and cells to feed to the constructor. We can do this by stealing from normal images that were loaded earlier.
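
For example, assuming image1, normalMap, and heightMap were loaded as ordinary textures during the preload phase:

    // Borrow images and cell data from textures already in the library
    var diffuse = this.textures.image1;
    var normal = this.textures.normalMap;
    var height = this.textures.heightMap;
    var cells = diffuse.cells;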

Second, we must fold the source images into a list.
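
Continuing the sketch – the order matters, and we will rely on it later when assigning texture units:

    // Diffuse first, then normal, then height
    var images = [ diffuse.image, normal.image, height.image ];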

Third, we must actually create the MultiTextureAtlas. Note that we’ve called it multiTextureAtlas, which is very clear in this context, and would be terrible practice if you were actually using it in a game. You should probably name it something better suited to its purpose.
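
Using the hypothetical constructor sketched above, with the standard single-image atlas type:

    var multiTextureAtlas = new Kiwi.Textures.MultiTextureAtlas(
        "multiTextureAtlas",
        Kiwi.Textures.TextureAtlas.SINGLE_IMAGE,
        cells,
        images );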

Fourth, we must add the MultiTextureAtlas to the state texture library.
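
Assuming the state exposes its texture library as textureLibrary, as in the core engine:

    // Register the new atlas so the renderer can find and upload it
    this.textureLibrary.add( multiTextureAtlas );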

Finally, we must apply the atlas and the renderer to the target entity. Note that the renderer requires very little setup: it is automatically available.
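
Putting both onto a hypothetical mySprite:

    // Point the entity at the new atlas...
    mySprite.atlas = multiTextureAtlas;

    // ...and request the shared ripple renderer by its RENDERER_ID
    mySprite.glRenderer =
        this.game.renderer.requestSharedRenderer( "RippleShaderRenderer" );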

Note that this should have no visual consequences so far. The MultiTextureAtlas returns image1, and the RippleShaderRenderer renders it using a default set of shaders.

Let’s change that.

Make a Shader

Shaders are all about pushing data around, point by point. To do this we use GLSL (OpenGL Shading Language), a language based on C syntax. Don’t panic; it’s pretty simple.

The Shader Pair

Let’s pull out our usual trick and extend an extant prototype.
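
A sketch of the extension, following the same pattern as the renderer (the name RippleShader is our choice):

    // Sketch: a custom shader pair extending the default TextureAtlasShader
    Kiwi.Shaders.RippleShader = function() {
        Kiwi.Shaders.TextureAtlasShader.call( this );

        // "Configure uniforms" step: new entries will be added here
        // "Configure shaders" step: vertSource and fragSource will be assigned here
    };
    Kiwi.Shaders.RippleShader.prototype =
        Object.create( Kiwi.Shaders.TextureAtlasShader.prototype );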

The Vertex Shader

We’re not going to do anything special with vertex data, so we’ll just copy the vertex shader from Kiwi.Shaders.TextureAtlasShader.
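
The listing below is a simplified but structurally faithful version; if you’re following along, copy the real thing from the engine source (Kiwi.Shaders.TextureAtlasShader) rather than from here.

    attribute vec4 aXYUV;        // x, y position and u, v texture coordinate
    attribute float aAlpha;      // per-vertex alpha

    uniform mat3 uCamMatrix;     // camera transformation
    uniform vec2 uResolution;    // stage size in pixels
    uniform vec2 uTextureSize;   // texture size in pixels

    varying vec2 vTextureCoord;  // sent on to the fragment shader
    varying float vAlpha;        // sent on to the fragment shader

    void main(void) {
        // Transform the vertex by the camera matrix
        vec2 pos = ( uCamMatrix * vec3( aXYUV.xy, 1.0 ) ).xy;

        // Map from pixel coordinates to WebGL clip space (-1 to 1, y flipped)
        pos = pos / uResolution * 2.0 - 1.0;
        gl_Position = vec4( pos.x, -pos.y, 0.0, 1.0 );

        // Normalise texture coordinates into the 0-1 range
        vTextureCoord = aXYUV.zw / uTextureSize;
        vAlpha = aAlpha;
    }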

Here we see several important features of GLSL. Let’s step through some key aspects.

attribute: a piece of data used to draw an object, or array of data used to draw several objects. In this case, Kiwi.js already supplies attributes regarding position and texture mapping.

uniform: as described above, a piece of data that’s constant across all objects drawn by this shader. These are the primary way to communicate data to shaders.

varying: a piece of data used inside the shader pipeline. Think of varying variables as messengers sent from the vertex shader to the fragment shader. You cannot access them from the outside, or send messages back the other way.

float: a piece of data that is a floating-point number. This means it has a decimal place, for example 0.321. Most data in a shader is made up of floats.

vec4: just like float, except it’s four numbers. For example, myVec4 = vec4(1.0, 0.5, 0.0, 0.2). There are also vec3s and vec2s.

vec4.xyzw and vec4.rgba and vec4.stpq are swizzles. A swizzle is a quick way of getting terms out of a vec. You must swizzle from the same set, e.g. you cannot mix rgba terminology with xyzw. However, within that set, you can access them any way you want. For example, myVec4.xz is (1.0, 0.0).

You can directly add, multiply, divide etc. swizzled vecs together. For example, myVec4.xy * myVec4.zw results in (0.0, 0.1). Swizzling is pretty neat.

mat3: a 3×3 matrix. This is where the high sorcery dwells. Matrix mathematics can do amazing things if you wrap your head around it. In this case, we’re using it to do arcane trickery to map screen coordinates and camera transformation to the WebGL context coordinate system. Like I said, high sorcery.

void main(void): declares the actual program. Here we calculate some values, set some varyings, and set some key properties.

gl_Position: a key property. This one sets the position of the vertex. (It happens to do so in four dimensions for computational reasons.)

Our Algorithm

Before we design our fragment shader, let’s think about what it actually has to do.

We want to create ripples. This means that parts of the image will move back and forth. We could do this by subdividing the object into hundreds of little patches and animating the vertices, but that is actually quite inefficient. Instead we’ll use a little trick: coordinate tweaking.

I happen to know that fragment shaders get colour by sampling from a texture using coordinates. All we have to do is modulate those coordinates in a regular, periodic fashion – and the texture will appear to swim back and forth.

Ah, but which way will they swim? Up and down? Side to side? What direction are the ripples travelling?

We can solve this in one of two ways. We could define a center using uniforms, then compute direction from there in the shader. In this case, because we want to explore the use of textures, we’ll use a pre-computed map instead.

Normal map

This is a normal map. These are used a lot in computer graphics, often to create nifty lighting effects. Today we’ll be using the map to indicate direction instead. Consider each pixel to represent a vector, where RGB color channels map to XYZ direction values. For example, flat blue regions are (0,0,1) – a vector pointing straight up, meaning a flat surface. This is a very quick way to get information into a shader.

(Technical note: In fact, flat blue regions are (0.5, 0.5, 1.0). The normal map presents a normalised vector which could point straight down along the Z-axis, i.e. (0, 0, -1.0). Because colour space does not support negative numbers, we have to pack the range (-1, 1) into the range (0, 1). We include a conversion term in the shader, below.)

This map consists of vectors pointing away from the center. This will allow waves to move in the right direction across the texture.

Alright, but how high should the waves be at each point? They should go up-down-up-down as they radiate from the center.

As with all things in computer graphics, we’re going to fake this. We’ll take another map, a height map consisting of grayscale values. (If we were being particularly efficient, we would store this as a single channel texture. However, that would involve a lot more under-the-hood work.) This map describes the distance from the center of the object.

Height map with black radial gradient at center

Now we know how far we are from the center, and what direction the waves are going. What now? We do one of my favourite things: apply a sine wave.

A sine wave goes up and down as the value increases. We’ll feed the distance from the height map into a sine wave, so as we move away from the middle, the waves go up and down.

We’ll also need a couple of extra parameters in the sine function:

  • A = Amplitude, how tall the waves are.
  • F = Frequency, how closely spaced the waves are (higher frequency means shorter waves).
  • P = Phase, how far along the waves are. We should animate this to cause the waves to move over time.

Now we have enough information to animate an offset, causing our fragment shader to seek texture information from changing parts of the texture. The rough steps are:

  • Obtain direction from texture.
  • Obtain distance from texture.
  • Compute wave amplitude * sin( distance * frequency + phase ).
  • Compute offset by multiplying direction.xy by computed wave.
  • Add offset to standard texture coordinates.
  • Draw!

Making Normal Maps

You don’t usually make normal maps by hand. Instead, you compute them from other, more intuitive forms of data. In this case I actually did things backwards: I made the height map first, then converted it to a normal map.

There are several tools available for normal map conversion. I used a program called CrazyBump. You can download normal map tools for Photoshop from NVidia. You can also get good results from an online tool called NormalMap-Online.

The Fragment Shader

So let’s put all that into a shader. To start with, we’ll just copy the TextureAtlasShader fragment shader:
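
This closely follows the engine’s default fragment shader; as with the vertex shader, consult the Kiwi.js source for the exact listing.

    precision mediump float;

    varying vec2 vTextureCoord;
    varying float vAlpha;

    uniform sampler2D uSampler;

    void main(void) {
        // Look up the texture colour at the interpolated coordinate
        gl_FragColor = texture2D( uSampler, vTextureCoord );
        gl_FragColor.a *= vAlpha;
    }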

Note that this has a couple of additional features.

  • precision mediump float: requests that the video card uses a certain amount of precision. WebGL allows devices to have high precision, but it doesn’t require anything more than medium. Bear this in mind if you’re developing for multiple platforms: some may not support high precision.
  • varying: these are receiving messages from the vertex shader.
  • sampler2D: this is a texture.
  • texture2D: this is a texture lookup. It takes a sampler2D and an xy coordinate, and returns the color of the texture at that coordinate. Note that the coordinates are in GL space, which is always 0-1.
  • gl_FragColor: this is the final output, the RGBA color of the pixel in question. Set this to make the video card draw a color.

So this default fragment shader just receives texture coordinates from the vertex shader, looks them up on a texture map, and draws that texture to the screen.

Adding Textures

For starters, we need three textures at once: the diffuse, the normal, and the height. (Diffuse refers to a flat texture without any particular lighting information. It represents a surface illuminated by a diffuse light.)

We’re going to create some more sampler2Ds. Our uniforms in the shader now read:
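
The names uSamplerNormal and uSamplerHeight are our own choice; any valid identifiers will do, so long as the renderer sets matching uniforms later.

    uniform sampler2D uSampler;        // diffuse (unchanged)
    uniform sampler2D uSamplerNormal;  // direction information
    uniform sampler2D uSamplerHeight;  // distance information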

We are leaving the diffuse channel as uSampler, because this is called out by the TextureAtlasRenderer and we don’t want to change the way it works.

Adding Sine Waves

Now we’ll add some controls for sculpting the ripples. We need three more values: amplitude, frequency, and phase.
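
Three more float uniforms, named after the sine parameters above (again, the names are our choice):

    uniform float uAmplitude;   // A: how tall the waves are
    uniform float uFrequency;   // F: how closely the waves are spaced
    uniform float uPhase;       // P: how far along the waves are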

Now we need to calculate the offset. Remember the steps we laid out when we were designing the algorithm:

  • Obtain direction from texture.
  • Obtain distance from texture.
  • Compute wave amplitude * sin( distance * frequency + phase ).
  • Compute offset by multiplying direction.xy by computed wave.
  • Add offset to standard texture coordinates.
  • Draw!

This translates to the following code:
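
Here is a sketch of main(), using the uniform names chosen above. Note the * 2.0 - 1.0 term, which unpacks the normal map from colour space back into the (-1, 1) range, as discussed earlier.

    void main(void) {
        // Obtain direction from the normal map, unpacking (0, 1) to (-1, 1)
        vec2 direction =
            texture2D( uSamplerNormal, vTextureCoord ).xy * 2.0 - 1.0;

        // Obtain distance from the height map
        // (any channel will do; the map is grayscale)
        float dist = texture2D( uSamplerHeight, vTextureCoord ).r;

        // Compute the wave at this point
        float wave = uAmplitude * sin( dist * uFrequency + uPhase );

        // Offset the texture coordinates and draw
        vec2 offset = direction * wave;
        gl_FragColor = texture2D( uSampler, vTextureCoord + offset );
        gl_FragColor.a *= vAlpha;
    }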

Adding Shaders to ShaderPairs

The video card actually expects these shaders as strings. The ShaderPair object holds those strings in two variables called vertSource and fragSource. Break up your shader code into lines, add them to an array, and assign them within the “configure shaders” step in your ShaderPair code.
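
In the RippleShader constructor sketched earlier, that looks something like this:

    // "Configure shaders": each line of GLSL becomes one string in the array
    this.vertSource = [
        "attribute vec4 aXYUV;",
        // ...the remaining vertex shader lines...
        "}"
    ];
    this.fragSource = [
        "precision mediump float;",
        // ...the remaining fragment shader lines...
        "}"
    ];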

Kiwi.js will take care of the rest, compiling the shaders and making sure they’re applied to the correct drawing operations.

Setting Uniforms

If you tried this now, it wouldn’t work. We need to actually set those new uniforms before the shader will function.

Let’s start with the floats, because they’re simplest. The TextureAtlasShader prototype has already created and populated an object called uniforms. You just need to add new properties to this object within the “configure uniforms” stage of your ShaderPair code. Each uniform must be registered as an object containing a type.
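
Using our names, the registration might read:

    // "Configure uniforms": register the floats alongside the inherited entries
    this.uniforms.uAmplitude = { type: "1f" };
    this.uniforms.uFrequency = { type: "1f" };
    this.uniforms.uPhase = { type: "1f" };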

As you’ll note, we’re declaring that these are “type 1f”. What does this mean? It means they are floats with 1 value. WebGL uses vectors so often that it thinks of single numbers as a special case of several numbers: a collection with 1 member. If you look at the source code for Kiwi.js, you’ll find that TextureAtlasShader also uses mat3, 2fv, and 1i. These represent a 3×3 matrix, a vector with 2 floats in a special array, and a solo integer. There are numerous data types available, and you should spend some time researching these.

After telling the game that these uniforms exist, we must set them. Remember way back when, we left a space for uniforms in our renderer? Now we’re going to fill it in. We’re also going to set some properties on the renderer itself, to give us better control over the shader without having to edit the shader code itself:
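
Here is a sketch of the finished enable() method, assuming each registered uniform gains a location property when the shader pair is initialised (as in the engine source). Amplitude, frequency, and phase become plain properties on the renderer, so game code can tweak them at runtime without touching GLSL.

    // In the RippleShaderRenderer constructor: sensible default values
    this.amplitude = 0.02;
    this.frequency = 10.0;
    this.phase = 0.0;

    // The revised enable() method
    Kiwi.Renderers.RippleShaderRenderer.prototype.enable = function( gl, params ) {
        Kiwi.Renderers.TextureAtlasRenderer.prototype.enable.call( this, gl, params );

        // Set the ripple uniforms from properties on the renderer
        gl.uniform1f( this.shaderPair.uniforms.uAmplitude.location, this.amplitude );
        gl.uniform1f( this.shaderPair.uniforms.uFrequency.location, this.frequency );
        gl.uniform1f( this.shaderPair.uniforms.uPhase.location, this.phase );
    };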

Setting Uniform Textures

Now let’s get the textures into the shader.

First, we need to make sure that the MultiTextureAtlas is loaded with all three textures, and in a known order. We’ll order them [diffuse, normal, height], so we know that the MultiTextureAtlas will load diffuse into texture unit 0, normal into texture unit 1, and height into texture unit 2.

Now that we know these textures are in specific texture units, we need simply set the uniforms in the ShaderPair to look at those numbered units. The diffuse texture is already set to unit 0 by the default TextureAtlasRenderer. We need simply assign texture units to the other uniforms:
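
Again inside enable(), after the parent call:

    // Diffuse is already on unit 0; point the other samplers at units 1 and 2
    gl.uniform1i( this.shaderPair.uniforms.uSamplerNormal.location, 1 );
    gl.uniform1i( this.shaderPair.uniforms.uSamplerHeight.location, 2 );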

And, of course, declare the uniforms in the ShaderPair itself:
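
Back in the RippleShader constructor, samplers are registered as integer (1i) uniforms:

    this.uniforms.uSamplerNormal = { type: "1i" };
    this.uniforms.uSamplerHeight = { type: "1i" };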

And that’s it – our uniforms are all set.

Apply Shader Pair

Finally, we want to apply the ShaderPair to the renderer. We do this by setting a string in the renderer:
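
This follows the pattern the default renderer uses to request its shader pair by name; the exact property name may differ between engine versions, so check the TextureAtlasRenderer source. The string itself must match the name of our shader pair in the Kiwi.Shaders namespace.

    // In the RippleShaderRenderer constructor:
    this.shaderPairName = "RippleShader";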

Final Plugin

Now the plugin is ready. Let’s review the final code. Here’s the plugin:

ripple_shader_plugin_v1.0.0.js:

And here’s the game code to display it:

index.html:

demoScene.js:

And here’s what it looks like in action!

Download zipped source code here.

(If you cannot see the game, your browser is likely not compatible with WebGL standards. You can see a screenshot of what you’re missing out on here.)

Extending the Shader to Animation and Beyond

This shader has many flaws and shortcomings. For example, you’ll find that it may react strangely if you try to use it with sprite sheets and animation in Kiwi.js. For the purposes of demonstration, I restricted it to a square texture with no animation. You will have to do more research if you want to bring out the full power of shaders.

Most of this is due to the fact that sprite sheets use “cells”, subdivisions of a single image. The MultiTextureAtlas assumes that all sources have the same cell layout, so if you were to try to animate our rippling shadow warrior, you’d need three sprite sheets: one for the diffuse, and one each for the normal and height maps, repeating the same basic information over and over.

If you’re interested in shaders, you will need to do a lot of research, and that’s a fact.

Cheat Sheet

Here’s a code template for you to fill in with your own crazy shaders:
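
The skeleton below condenses the pieces developed in this article. MyRenderer, MyShader, and uMyValue are placeholders to replace with your own names; the engine calls are the ones used above, so verify them against your version of Kiwi.js.

    // ---- Plugin boilerplate ----
    Kiwi.PluginManager.register( {
        name: "MyShaderPlugin",
        version: "1.0.0"
    } );

    // ---- Custom renderer ----
    Kiwi.Renderers.MyRenderer = function( gl, shaderManager, params ) {
        Kiwi.Renderers.TextureAtlasRenderer.call( this, gl, shaderManager, params );

        // Properties backing your unique uniforms
        this.myValue = 0.0;

        // Name of the shader pair to request
        this.shaderPairName = "MyShader";
    };
    Kiwi.Renderers.MyRenderer.prototype =
        Object.create( Kiwi.Renderers.TextureAtlasRenderer.prototype );
    Kiwi.Renderers.MyRenderer.RENDERER_ID = "MyRenderer";

    Kiwi.Renderers.MyRenderer.prototype.enable = function( gl, params ) {
        Kiwi.Renderers.TextureAtlasRenderer.prototype.enable.call( this, gl, params );

        // Set your unique uniforms here
        gl.uniform1f( this.shaderPair.uniforms.uMyValue.location, this.myValue );
    };

    // ---- Custom shader pair ----
    Kiwi.Shaders.MyShader = function() {
        Kiwi.Shaders.TextureAtlasShader.call( this );

        // Configure uniforms
        this.uniforms.uMyValue = { type: "1f" };

        // Configure shaders
        this.vertSource = [
            // ...your vertex shader, line by line, as strings...
        ];
        this.fragSource = [
            // ...your fragment shader, line by line, as strings...
        ];
    };
    Kiwi.Shaders.MyShader.prototype =
        Object.create( Kiwi.Shaders.TextureAtlasShader.prototype );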

This template doesn’t include texturing, multitexturing, or creating and assigning renderers or shaders to game objects. You’ll have to do that on your own.

In Review

Whew! Today we learned how to create a whole shader pipeline. We’ve covered vertex shaders, fragment shaders, uniforms, GLSL and its variable types including vecs and swizzling, custom renderers, custom shader pairs, normal maps, multitextures, and a whole lot more. I sincerely hope this was useful. We put a lot of work into making this technology available to you.

This has been the last article in the launch series for Kiwi.js v1.1.0. Thanks for sticking through to the end. You haven’t heard the last of us; Kiwi.js is always growing and improving. We’re going to add some amazing new stuff to the library in the future, and I expect I’ll be back to explain it.

In the meantime, don’t hesitate to ask about any details in the forums or using our various social media systems. Here’s a secret: I had to learn the engine myself before I started working on it, but I had the advantage of an office full of experts. I’m like a kobold scurrying in the depths of the earth, surrounded by jewels, while the people of the surface world go back and forth unaware of the riches beneath their feet. These articles were designed to bring some of those jewels into the light. If you want more, all you have to do is ask.

Benjamin D. Richards
