Hello, and welcome to the seventh – and last – in a series of in-depth articles about Kiwi.js v1.1.0, now available for use at www.kiwijs.org. Today we’re talking about WebGL shaders and how they can do amazing things. I’m Ben Richards, and I’ve put a lot of effort into bringing powerful shaders to Kiwi.js.
This is a complex topic. It’s going to be quite long, and involve a fair amount of code. Before you begin, you should be familiar with the following topics covered in previous articles:
You should also be familiar with the procedure for creating a Kiwi.js plugin. You will need to do a fair amount of research on your own to fully understand the topic, but this should be enough to get you started.
Who Needs to Read This Post?
Advanced users. If you want to create WebGL shaders, this is where it all comes together. The article is targeted to Kiwi.js users, and depends to some extent on the Kiwi.js infrastructure. However, the principles may also be useful for other projects. If you just want to use advanced shaders that somebody else has coded, things are much simpler – they should come with instructions.
Elements of a Shader
In general terms, a WebGL shader comes in two parts: a vertex shader and a fragment shader. Together these make a program; in Kiwi.js, we store them as a shader pair. Information is passed into shaders as uniforms.
Vertex Shader
This code performs operations on the vertices of an object. These are the corners of objects; in Kiwi.js, this is usually the corners of a square. Vertex shaders are generally used to control position, movement, and generic lighting.
Fragment Shader
This code performs operations on fragments. A fragment is essentially a single pixel. It inherits information from the vertex shader. Fragments can control colour, texture, advanced lighting, and more. Most advanced trickery occurs in fragment shaders.
Shader Pair – Program
The graphics card renders using a vertex shader and a fragment shader. These are compiled and used together. In Kiwi.js, we store the program in a shader pair. The basic shader pair is Kiwi.Shaders.ShaderPair, which provides certain key functions used under the hood. It also stores the vertex and fragment shaders.
Uniforms
A uniform is a variable sent to a shader program. The vertex and fragment shaders both have their own uniforms. A uniform can be one of several types, determined within the shader. It is called “uniform” because it does not change during the time the frame renders; it is always the same value for every vertex and pixel being rendered. You can change the uniform's value between frames, once the current frame is done.
Creating a Shader Plugin
Today we’re going to create a shader. There are infinitely many possibilities available. I’ve decided to create a “ripple shader”, a system that distorts the texture being drawn as though it were reflected in water.
Groundwork
Before we can even begin to work on our shader, we need to be able to see it. We must apply the shader to an object. This requires a little bit of work. We’re going to need a custom renderer, a multi-texture atlas, and a couple of stages of setup.
Custom Renderer
Shader pairs live on Renderer objects. You need a unique Renderer to separate your effects from the rest of the scene. In most cases, you’ll also need a renderer to set unique uniforms on the shader pair.
While it is possible to extend Kiwi.Renderers.Renderer (the basic renderer prototype), the great majority of shaders will be designed to work with textured quadrilateral objects – Kiwi.js assets. For this reason, it's usually better to extend Kiwi.Renderers.TextureAtlasRenderer, the default renderer, which is designed to do most of the heavy lifting for you.
Most of the TextureAtlasRenderer code is already fit for purpose; the only place where unique uniforms come into play is TextureAtlasRenderer.enable(). This should be the only piece of code you need to alter or extend.
As part of your plugin, you should create an extended renderer. This should be in the Kiwi.Renderers namespace so that it can be recognised by the engine.
```js
//
// Renderer Boilerplate
//

Kiwi.Renderers.RippleShaderRenderer = function( gl, shaderManager, params ) {
	// Perform super functionality
	Kiwi.Renderers.TextureAtlasRenderer.call( this, gl, shaderManager, params );

	// Custom functionality
	/* Your custom code goes here */
};

// Extend renderer
Kiwi.extend( Kiwi.Renderers.RippleShaderRenderer,
	Kiwi.Renderers.TextureAtlasRenderer );

//
// End Boilerplate
//

// Extend functionality
Kiwi.Renderers.RippleShaderRenderer.prototype.enable = function( gl, params ) {
	// Boilerplate extension
	Kiwi.Renderers.TextureAtlasRenderer.prototype.enable.call( this, gl, params );

	// Custom functionality
	/* Your custom uniform code goes here */
};
```
We will revisit the custom code later, once we understand what kind of uniforms we need.
Multi-Texture Atlas
Most shaders will require a multi-texture atlas, as one image simply can’t contain enough data to fully describe a complex situation. We’ve expanded the texture pipeline to support a multi-texture plugin. For this example we’ll include multi-texture functionality with the shader plugin itself.
We hope to make a formal multi-texture plugin available very soon. Strictly speaking, such a fundamental tool should be released on its own, independent of however many shaders you are using; we shouldn't really be bundling it into the same plugin. Try not to do things this way in your own projects – it will save you time and worry in the long term.
Anyway, here’s a multi-texture atlas:
```js
Kiwi.Textures.MultiTextureAtlas = function( name, type, cells, images, sequences ) {
	// Preprocess images array
	this.images = images;

	// Create a default texture: this is passed to the super call
	var image = images[0];

	// Perform super functionality
	Kiwi.Textures.TextureAtlas.call( this, name, type, cells, image, sequences );
};
Kiwi.extend( Kiwi.Textures.MultiTextureAtlas, Kiwi.Textures.TextureAtlas );

// Extend functionality
Kiwi.Textures.MultiTextureAtlas.prototype.createGLTextureWrapper = function( gl, textureManager ) {
	// Super call: create default wrapper
	Kiwi.Textures.TextureAtlas.prototype.createGLTextureWrapper.call( this, gl, textureManager );

	// Create additional wrappers, add to a list, and assign them correct values
	this.glTextureWrappers = [ this.glTextureWrapper ];
	for ( var i = 1; i < this.images.length; i++ ) {
		var wrapper = new Kiwi.Renderers.GLTextureWrapper( gl, this );
		wrapper.image = this.images[ i ];
		this.glTextureWrappers.push( wrapper );
		textureManager.registerTextureWrapper( gl, wrapper );
	}
};

Kiwi.Textures.MultiTextureAtlas.prototype.enableGL = function( gl, renderer, textureManager ) {
	// This function replaces the template version, so it doesn't call super.
	// We have copied some functionality from TextureAtlas.

	// Set resolution uniforms
	renderer.updateTextureSize( gl,
		new Float32Array( [ this.image.width, this.image.height ] ) );

	// Upload textures
	// This ensures that all necessary texture units are in use
	for ( var i = 0; i < this.glTextureWrappers.length; i++ ) {
		textureManager.useTexture( gl, this.glTextureWrappers[ i ], i );
	}

	// If necessary, refresh the textures
	if ( this.dirty ) {
		this.refreshTextureGL( gl );
	}
};

Kiwi.Textures.MultiTextureAtlas.prototype.refreshTextureGL = function( gl ) {
	// Super call: refresh default wrapper
	Kiwi.Textures.TextureAtlas.prototype.refreshTextureGL.call( this, gl );

	// Refresh additional wrappers
	for ( var i = 1; i < this.glTextureWrappers.length; i++ ) {
		if ( this.glTextureWrappers[ i ] ) {
			this.glTextureWrappers[ i ].refreshTexture( gl );
		}
	}
};
```
Game Implementation
In addition, we need to do a few more things to use these structures. This code does not belong in the plugin; it must be part of your game code itself.
First, we must have images and cells to feed to the constructor. We can do this by stealing from normal images that were loaded earlier.
Second, we must fold the source images into a list.
Third, we must actually create the MultiTextureAtlas. Note that we've called it multiTextureAtlas, which is very clear in this context, but would be terrible practice if you were actually using it in a game. You should probably name it something better suited to its purpose.
Fourth, we must add the MultiTextureAtlas to the state texture library.
Finally, we must apply the atlas and the renderer to the target entity. Note that the renderer requires very little setup: it is automatically available.
```js
// In the State.preload section:
// ...
this.addImage( "image1", "image1.png" );
this.addImage( "image2", "image2.png" );
// ...

// In the State.create section:
// ...
var multiTextureImages = [
	myState.textures.image1.image,
	myState.textures.image2.image ];
var multiTextureAtlas = new Kiwi.Textures.MultiTextureAtlas(
	"MultiTextureAtlas",
	Kiwi.Textures.MultiTextureAtlas.SINGLE_IMAGE,
	myState.textures.image1.cells,
	multiTextureImages,
	null );

// Add to library
this.textureLibrary.add( multiTextureAtlas );

// Apply to entity
// EITHER: apply on creation
this.entity = new Kiwi.GameObjects.StaticImage( this, multiTextureAtlas, 0, 0 );
this.entity.glRenderer =
	this.game.renderer.requestSharedRenderer( "RippleShaderRenderer" );

// OR: apply after creation
/*
this.entity.atlas = multiTextureAtlas;
this.entity.glRenderer =
	this.game.renderer.requestSharedRenderer( "RippleShaderRenderer" );
*/
// ...
```
Note that this should have no visual consequences so far. The MultiTextureAtlas returns image1, and the RippleShaderRenderer renders it using a default set of shaders.
Let’s change that.
Make a Shader
Shaders are all about pushing data around, point by point. To do this we use GLSL (OpenGL Shading Language), a language based on C syntax. Don't panic; it's pretty simple.
The Shader Pair
Let's pull our usual trick, and extend an extant prototype.
```js
Kiwi.Shaders.RippleShader = function() {
	// Super call
	Kiwi.Shaders.TextureAtlasShader.call( this );

	// Extended functionality

	// Configure uniforms
	/* Your code goes here */

	// Configure shaders
	/* Your code goes here */
};
Kiwi.extend( Kiwi.Shaders.RippleShader, Kiwi.Shaders.TextureAtlasShader );
```
The Vertex Shader
We're not going to do anything special with vertex data, so we'll just copy the vertex shader from Kiwi.Shaders.TextureAtlasShader.
```glsl
attribute vec4 aXYUV;
attribute float aAlpha;
uniform mat3 uCamMatrix;
uniform vec2 uResolution;
uniform vec2 uTextureSize;
varying vec2 vTextureCoord;
varying float vAlpha;
void main(void) {
	vec2 pos = (uCamMatrix * vec3(aXYUV.xy, 1)).xy;
	gl_Position = vec4((pos / uResolution * 2.0 - 1.0) * vec2(1, -1), 0, 1);
	vTextureCoord = aXYUV.zw / uTextureSize;
	vAlpha = aAlpha;
}
```
Here we see several important features of GLSL. Let’s step through some key aspects.
- attribute: a piece of data used to draw an object, or an array of data used to draw several objects. In this case, Kiwi.js already supplies attributes regarding position and texture mapping.
- uniform: as described above, a piece of data that is constant across all objects drawn by this shader. Uniforms are the primary way to communicate data to shaders.
- varying: a piece of data used inside the shader pipeline. Think of varying variables as messengers sent from the vertex shader to the fragment shader. You cannot access them from the outside, or send messages back the other way.
- float: a floating-point number, such as 0.321. Most data in a shader is made up of floats.
- vec4: just like float, except it's four numbers. For example, myVec4 = vec4(1.0, 0.5, 0.0, 0.2). There are also vec3s and vec2s.
- vec4.xyzw, vec4.rgba, and vec4.stpq are swizzles. A swizzle is a quick way of getting terms out of a vec. You must swizzle from a single set; e.g. you cannot mix rgba terminology with xyzw. However, within that set, you can access the terms in any order and combination. For example, myVec4.xz is (1.0, 0.0). You can also directly add, multiply, divide, etc. swizzled vecs; for example, myVec4.xy * myVec4.zw results in (0.0, 0.1). Swizzling is pretty neat.
- mat3: a 3×3 matrix. This is where the high sorcery dwells. Matrix mathematics can do amazing things if you wrap your head around it. In this case, we're using it to map screen coordinates and the camera transformation into the WebGL context coordinate system. Like I said, high sorcery.
- void main(void): declares the actual program. Here we calculate some values, set some varyings, and set some key properties.
- gl_Position: a key property. This one sets the position of the vertex. (It happens to do so in four dimensions for computational reasons.)
Our Algorithm
Before we design our fragment shader, let’s think about what it actually has to do.
We want to create ripples. This means that parts of the image will move back and forth. We could do this by subdividing the object into hundreds of little patches and animating the vertices, but that is actually quite inefficient. Instead we’ll use a little trick: coordinate tweaking.
I happen to know that fragment shaders get colour by sampling from a texture using coordinates. All we have to do is modulate those coordinates in a regular, periodic fashion – and the texture will appear to swim back and forth.
Ah, but which way will they swim? Up and down? Side to side? What direction are the ripples travelling?
We could solve this in one of two ways. We could define a center using uniforms, then compute direction from there. In this case, because we want to explore the use of textures, we'll use a pre-computed map instead.

This is a normal map. These are used a lot in computer graphics, often to create nifty lighting effects. Today we’ll be using the map to indicate direction instead. Consider each pixel to represent a vector, where RGB color channels map to XYZ direction values. For example, flat blue regions are (0,0,1) – a vector pointing straight up, meaning a flat surface. This is a very quick way to get information into a shader.
(Technical note: In fact, flat blue regions are (0.5, 0.5, 1.0). The normal map presents a normalised vector which could point straight down along the Z-axis, i.e. (0, 0, -1.0). Because colour space does not support negative numbers, we have to pack the range (-1, 1) into the range (0, 1). We include a conversion term in the shader, below.)
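To make that conversion term concrete, here is the same remapping in plain JavaScript (the function name is illustrative; the shader performs the equivalent arithmetic on a vec4):

```javascript
// Remap a normal-map colour from the (0, 1) colour range
// into the (-1, 1) vector range: v = c * 2 - 1.
function colourToNormal( r, g, b ) {
	return [ r * 2 - 1, g * 2 - 1, b * 2 - 1 ];
}

// Flat blue (0.5, 0.5, 1.0) becomes (0, 0, 1):
// a normal pointing straight up, meaning a flat surface.
var flat = colourToNormal( 0.5, 0.5, 1.0 );
```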
This map consists of vectors pointing away from the center. This will allow waves to move in the right direction across the texture.
Alright, but how high should the waves be at each point? They should go up-down-up-down as they radiate from the center.
As with all things in computer graphics, we’re going to fake this. We’ll take another map, a height map consisting of grayscale values. (If we were being particularly efficient, we would store this as a single channel texture. However, that would involve a lot more under-the-hood work.) This map describes the distance from the center of the object.

Now we know how far we are from the center, and what direction the waves are going. What now? We do one of my favourite things: apply a sine wave.
A sine wave goes up and down as the value increases. We’ll feed the distance from the height map into a sine wave, so as we move away from the middle, the waves go up and down.
We’ll also need a couple of extra parameters in the sine function:
- A = Amplitude, how tall the waves are.
- F = Frequency, how closely the waves are spaced (higher frequency means shorter waves).
- P = Phase, how far along the waves are. We should animate this to cause the waves to move over time.
Now we have enough information to animate an offset, causing our fragment shader to seek texture information from changing parts of the texture. The rough steps are:
- Obtain direction from texture.
- Obtain distance from texture.
- Compute wave: amplitude * sin( distance * frequency + phase ).
- Compute offset by multiplying direction.xy by computed wave.
- Add offset to standard texture coordinates.
- Draw!
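The steps above can be sketched in plain JavaScript for a single fragment (the names are illustrative; the real work happens per fragment in GLSL, later in this article):

```javascript
// Compute the texture-coordinate offset for one fragment.
// direction: (x, y) from the normal map; distance: from the height map.
function rippleOffset( direction, distance, amplitude, frequency, phase ) {
	var wave = amplitude * Math.sin( distance * frequency + phase );
	return [ direction[ 0 ] * wave, direction[ 1 ] * wave ];
}

// A fragment on a wave crest (sin term = 1) is displaced by the
// full amplitude along the ripple direction.
var offset = rippleOffset( [ 1, 0 ], Math.PI / 2, 0.1, 1, 0 );
```

Animating phase over time slides the crests outward, which is exactly what the renderer will do each frame.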
Making Normal Maps
You don't usually make normal maps by hand. Instead, you compute them from other, more intuitive forms of data. In this case I actually did things backwards: I made the height map first, then converted it to a normal map.
There are several tools available for normal map conversion. I used a program called CrazyBump. You can download normal map tools for Photoshop from NVidia. You can also get good results from an online tool called NormalMap-Online.
The Fragment Shader
So let’s put all that into a shader. To start with, we’ll just copy the TextureAtlasShader fragment shader:
```glsl
precision mediump float;
varying vec2 vTextureCoord;
varying float vAlpha;
uniform sampler2D uSampler;
void main(void) {
	gl_FragColor = texture2D(uSampler, vec2(vTextureCoord.x, vTextureCoord.y));
	gl_FragColor.a *= vAlpha;
}
```
Note that this has a couple of additional features.
- precision mediump float: requests that the video card use a certain amount of precision. WebGL allows devices to support high precision, but it doesn't require anything more than medium. Bear this in mind if you're developing for multiple platforms: some may not support high precision.
- varying: these receive messages from the vertex shader.
- sampler2D: this is a texture.
- texture2D: this is a texture lookup. It takes a sampler2D and an xy coordinate, and returns the color of the texture at that coordinate. Note that the coordinates are in GL space, which always runs from 0 to 1.
- gl_FragColor: the final output, the RGBA color of the pixel in question. Set this to make the video card draw a color.
So this default fragment shader just receives texture coordinates from the vertex shader, looks them up on a texture map, and draws that texture to the screen.
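Because texture2D works in 0-1 GL space rather than pixels, converting a pixel position into lookup coordinates is a simple division. A quick sketch (the function name is illustrative, not part of any API):

```javascript
// Convert a pixel position to GL texture coordinates (0-1 on each axis).
function pixelToUV( x, y, textureWidth, textureHeight ) {
	return [ x / textureWidth, y / textureHeight ];
}

var uv = pixelToUV( 64, 32, 128, 128 ); // [0.5, 0.25]
```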
Adding Textures
For starters, we need three textures at once: the diffuse, the normal, and the height. (Diffuse refers to a flat texture without any particular lighting information. It represents a surface illuminated by a diffuse light.)
We're going to create some more sampler2Ds. Our uniforms in the shader now read:
```glsl
uniform sampler2D uSampler;
uniform sampler2D uNormalSampler;
uniform sampler2D uHeightSampler;
```
We are leaving the diffuse channel as uSampler, because this is called out by the TextureAtlasRenderer and we don’t want to change the way it works.
Adding Sine Waves
Now we’ll add some controls for sculpting the ripples. We need three more values: amplitude, frequency, and phase.
```glsl
uniform float uWaveAmplitude;
uniform float uWaveFrequency;
uniform float uWavePhase;
```
Now we need to calculate the offset. Remember the steps we laid out when we were designing the algorithm:
- Obtain direction from texture.
- Obtain distance from texture.
- Compute wave: amplitude * sin( distance * frequency + phase ).
- Compute offset by multiplying direction.xy by computed wave.
- Add offset to standard texture coordinates.
- Draw!
This translates to the following code:
```glsl
void main(void) {
	// Get offset data
	vec4 normalCol = texture2D(uNormalSampler, vec2(vTextureCoord.x, vTextureCoord.y));
	vec4 heightCol = texture2D(uHeightSampler, vec2(vTextureCoord.x, vTextureCoord.y));

	// Because normal textures are in the range (0,1),
	// and should be in the range (-1,1), we should remap them:
	normalCol = normalCol * vec4(2,2,2,1) + vec4(-1,-1,-1,0);

	// Compute offset data using one channel from height map
	float wave = uWaveAmplitude * sin(heightCol.x * uWaveFrequency + uWavePhase);
	vec2 offset = normalCol.xy * wave;

	// Get final color
	gl_FragColor = texture2D(uSampler, vec2(vTextureCoord.x + offset.x, vTextureCoord.y + offset.y));
	gl_FragColor.a *= vAlpha;
}
```
Adding Shaders to ShaderPairs
The video card actually expects these shaders as strings. The ShaderPair object holds those strings in two variables called vertSource and fragSource. Break up your shader code into lines, add them to an array, and assign them within the "configure shaders" step in your ShaderPair code.
```js
this.vertSource = [
	"attribute vec4 aXYUV;",
	"attribute float aAlpha;",
	"uniform mat3 uCamMatrix;",
	"uniform vec2 uResolution;",
	"uniform vec2 uTextureSize;",
	"varying vec2 vTextureCoord;",
	"varying float vAlpha;",
	"void main(void) {",
	"	vec2 pos = (uCamMatrix * vec3(aXYUV.xy,1)).xy;",
	"	gl_Position = vec4((pos / uResolution * 2.0 - 1.0) * vec2(1, -1), 0, 1);",
	"	vTextureCoord = aXYUV.zw / uTextureSize;",
	"	vAlpha = aAlpha;",
	"}"
];

//this.fragSource = ...
// An exercise for the reader.
```
Kiwi.js will take care of the rest, compiling the shaders and making sure they're applied to the correct drawing operations.
If you tried this now, it wouldn’t work. We need to actually set those new uniforms before the shader will function.
Let's start with the floats, because they're simplest. The TextureAtlasShader prototype has already created and populated an object called uniforms. You just need to add new properties to this object within the "configure uniforms" stage of your ShaderPair code. Each uniform must be registered as an object containing a type.
```js
this.uniforms.uWaveAmplitude = { type: "1f" };
this.uniforms.uWaveFrequency = { type: "1f" };
this.uniforms.uWavePhase = { type: "1f" };
```
As you’ll note, we’re declaring that these are “type 1f”. What does this mean? It means they are floats with 1 value. WebGL uses vectors so often that it thinks of single numbers as a special case of several numbers: a collection with 1 member. If you look at the source code for Kiwi.js, you’ll find that TextureAtlasShader also uses mat3, 2fv, and 1i. These represent a 3×3 matrix, a vector with 2 floats in a special array, and a solo integer. There are numerous data types available, and you should spend some time researching these.
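For reference, each of those type strings corresponds to a setter method on the WebGL context. The table below is just an illustrative lookup (the method names on the right are standard WebGLRenderingContext methods; the mapping object itself is hypothetical, not Kiwi.js API):

```javascript
// Map uniform type strings to the WebGL methods that set them.
var uniformSetters = {
	"1f": "uniform1f",           // one float
	"2fv": "uniform2fv",         // two floats, passed as an array
	"1i": "uniform1i",           // one integer (e.g. a texture unit index)
	"mat3": "uniformMatrix3fv"   // 3x3 matrix, passed as an array
};
```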
After telling the game that these uniforms exist, we must set them. Remember way back when, we left a space for uniforms in our renderer? Now we’re going to fill it in. We’re also going to set some properties on the renderer itself, to give us better control over the shader without having to edit the shader code itself:
```js
Kiwi.Renderers.RippleShaderRenderer = function( gl, shaderManager, params ) {
	// Perform super functionality
	Kiwi.Renderers.TextureAtlasRenderer.call( this, gl, shaderManager, params );

	// Custom functionality
	this.amplitude = 0.1;
	this.frequency = 40;
	this.phase = 0;
};

// Extend renderer
Kiwi.extend( Kiwi.Renderers.RippleShaderRenderer,
	Kiwi.Renderers.TextureAtlasRenderer );
```
```js
Kiwi.Renderers.RippleShaderRenderer.prototype.enable = function( gl, params ) {
	// Boilerplate extension
	Kiwi.Renderers.TextureAtlasRenderer.prototype.enable.call( this, gl, params );

	// Custom functionality
	gl.uniform1f( this.shaderPair.uniforms.uWaveAmplitude.location, this.amplitude );
	gl.uniform1f( this.shaderPair.uniforms.uWaveFrequency.location, this.frequency );
	gl.uniform1f( this.shaderPair.uniforms.uWavePhase.location, this.phase );
};
```
Setting Uniform Textures
Now let’s get the textures into the shader.
First, we need to make sure that the MultiTextureAtlas is loaded with all three textures, and in a known order. We’ll order them [diffuse, normal, height], so we know that the MultiTextureAtlas will load diffuse into texture unit 0, normal into texture unit 1, and height into texture unit 2.
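Because the MultiTextureAtlas uploads its wrappers in array order (see the enableGL loop earlier), each map's texture unit follows directly from its index in the images array. A small sketch of that correspondence:

```javascript
// The order of the images array fixes the texture units.
var mapOrder = [ "diffuse", "normal", "height" ];

var textureUnits = {};
mapOrder.forEach( function( name, i ) {
	textureUnits[ name ] = i;
} );
// textureUnits: { diffuse: 0, normal: 1, height: 2 }
```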
```js
// In the State.preload section:
// ...
this.addImage( "characterDiffuse", "character_diffuse.png" );
this.addImage( "characterNormal", "character_normal.png" );
this.addImage( "characterHeight", "character_height.png" );
// ...

// In the State.create section:
// ...
var multiTextureImages = [
	myState.textures.characterDiffuse.image,
	myState.textures.characterNormal.image,
	myState.textures.characterHeight.image ];
// ...
```
Now that we know these textures are in specific texture units, we simply need to set the uniforms in the ShaderPair to look at those numbered units. The diffuse texture is already set to unit 0 by the default TextureAtlasRenderer, so we just assign texture units to the other uniforms:
```js
//...
gl.uniform1i( this.shaderPair.uniforms.uNormalSampler.location, 1 );
gl.uniform1i( this.shaderPair.uniforms.uHeightSampler.location, 2 );
//...
```
And, of course, declare the uniforms in the ShaderPair itself:
```js
this.uniforms.uNormalSampler = { type: "1i" };
this.uniforms.uHeightSampler = { type: "1i" };
```
And that’s it – our uniforms are all set.
Apply Shader Pair
Finally, we want to apply the ShaderPair to the renderer. We do this by setting a string in the renderer:
```js
Kiwi.Renderers.RippleShaderRenderer = function( gl, shaderManager, params ) {
	// Perform super functionality
	Kiwi.Renderers.TextureAtlasRenderer.call( this, gl, shaderManager, params );

	// Custom functionality
	this.amplitude = 0.1;
	this.frequency = 40;
	this.phase = 0;

	this.setShaderPair( "RippleShader" );
};

// Extend renderer
Kiwi.extend( Kiwi.Renderers.RippleShaderRenderer,
	Kiwi.Renderers.TextureAtlasRenderer );
```
Final Plugin
Now the plugin is ready. Let’s review the final code. Here’s the plugin:
ripple_shader_plugin_v1.0.0.js:
```js
/*
 * Declare plugin
 */
Kiwi.Plugins.RippleShader = {
	name: 'RippleShader',
	version: '1.0.0',
	minimumKiwiVersion: '1.1.0',
	pluginDependencies: [ { } ]
};
Kiwi.PluginManager.register( Kiwi.Plugins.RippleShader );

/*
 * Ripple Shader Renderer
 */
Kiwi.Renderers.RippleShaderRenderer = function( gl, shaderManager, params ) {
	// Perform super functionality
	Kiwi.Renderers.TextureAtlasRenderer.call( this, gl, shaderManager, params );

	// Custom functionality
	this.amplitude = 0.1;
	this.frequency = 40;
	this.phase = 0;

	this.setShaderPair( "RippleShader" );
};

// Extend renderer
Kiwi.extend( Kiwi.Renderers.RippleShaderRenderer,
	Kiwi.Renderers.TextureAtlasRenderer );

Kiwi.Renderers.RippleShaderRenderer.prototype.enable = function( gl, params ) {
	// Boilerplate extension
	Kiwi.Renderers.TextureAtlasRenderer.prototype.enable.call( this, gl, params );

	// Custom functionality
	gl.uniform1f( this.shaderPair.uniforms.uWaveAmplitude.location, this.amplitude );
	gl.uniform1f( this.shaderPair.uniforms.uWaveFrequency.location, this.frequency );
	gl.uniform1f( this.shaderPair.uniforms.uWavePhase.location, this.phase );
	gl.uniform1i( this.shaderPair.uniforms.uNormalSampler.location, 1 );
	gl.uniform1i( this.shaderPair.uniforms.uHeightSampler.location, 2 );
};

/*
 * MultiTexture Atlas
 */
Kiwi.Textures.MultiTextureAtlas = function( name, type, cells, images, sequences ) {
	// Preprocess images array
	this.images = images;

	// Create a default texture: this is passed to the super call
	var image = images[0];

	// Perform super functionality
	Kiwi.Textures.TextureAtlas.call( this, name, type, cells, image, sequences );
};
Kiwi.extend( Kiwi.Textures.MultiTextureAtlas, Kiwi.Textures.TextureAtlas );

// Extend functionality
Kiwi.Textures.MultiTextureAtlas.prototype.createGLTextureWrapper = function( gl, textureManager ) {
	// Super call: create default wrapper
	Kiwi.Textures.TextureAtlas.prototype.createGLTextureWrapper.call( this, gl, textureManager );

	// Create additional wrappers, add to a list, and assign them correct values
	this.glTextureWrappers = [ this.glTextureWrapper ];
	for ( var i = 1; i < this.images.length; i++ ) {
		var wrapper = new Kiwi.Renderers.GLTextureWrapper( gl, this );
		wrapper.image = this.images[ i ];
		this.glTextureWrappers.push( wrapper );
		textureManager.registerTextureWrapper( gl, wrapper );
	}
};

Kiwi.Textures.MultiTextureAtlas.prototype.enableGL = function( gl, renderer, textureManager ) {
	// This function replaces the template version, so it doesn't call super.
	// We have copied some functionality from TextureAtlas.

	// Set resolution uniforms
	renderer.updateTextureSize( gl,
		new Float32Array( [ this.image.width, this.image.height ] ) );

	// Upload textures
	// This ensures that all necessary texture units are in use
	for ( var i = 0; i < this.glTextureWrappers.length; i++ ) {
		textureManager.useTexture( gl, this.glTextureWrappers[ i ], i );
	}

	// If necessary, refresh the textures
	if ( this.dirty ) {
		this.refreshTextureGL( gl );
	}
};

Kiwi.Textures.MultiTextureAtlas.prototype.refreshTextureGL = function( gl ) {
	// Super call: refresh default wrapper
	Kiwi.Textures.TextureAtlas.prototype.refreshTextureGL.call( this, gl );

	// Refresh additional wrappers
	for ( var i = 1; i < this.glTextureWrappers.length; i++ ) {
		if ( this.glTextureWrappers[ i ] ) {
			this.glTextureWrappers[ i ].refreshTexture( gl );
		}
	}
};

/*
 * Ripple Shader Pair
 */
Kiwi.Shaders.RippleShader = function() {
	// Super call
	Kiwi.Shaders.TextureAtlasShader.call( this );

	// Extended functionality

	// Configure uniforms
	this.uniforms.uWaveAmplitude = { type: "1f" };
	this.uniforms.uWaveFrequency = { type: "1f" };
	this.uniforms.uWavePhase = { type: "1f" };
	this.uniforms.uNormalSampler = { type: "1i" };
	this.uniforms.uHeightSampler = { type: "1i" };

	// Configure shaders
	// vertSource is identical to the template and need not be specified.
	this.fragSource = [
		"precision mediump float;",
		"varying vec2 vTextureCoord;",
		"varying float vAlpha;",
		"uniform sampler2D uSampler;",
		"uniform sampler2D uNormalSampler;",
		"uniform sampler2D uHeightSampler;",
		"uniform float uWaveAmplitude;",
		"uniform float uWaveFrequency;",
		"uniform float uWavePhase;",
		"void main(void) {",
		"	// Get offset data",
		"	vec4 normalCol = texture2D(uNormalSampler, vec2(vTextureCoord.x, vTextureCoord.y));",
		"	vec4 heightCol = texture2D(uHeightSampler, vec2(vTextureCoord.x, vTextureCoord.y));",
		"	// Because normal textures are in the range (0,1),",
		"	// and should be in the range (-1,1), we should remap them:",
		"	normalCol = normalCol * vec4(2,2,2,1) + vec4(-1,-1,-1,0);",
		"	// Compute offset data using one channel from height map",
		"	float wave = uWaveAmplitude * sin(heightCol.x * uWaveFrequency + uWavePhase);",
		"	vec2 offset = normalCol.xy * wave;",
		"	// Get final color",
		"	gl_FragColor = texture2D(uSampler, vec2(vTextureCoord.x + offset.x, vTextureCoord.y + offset.y));",
		"	gl_FragColor.a *= vAlpha;",
		"}"
	];
};
Kiwi.extend( Kiwi.Shaders.RippleShader, Kiwi.Shaders.TextureAtlasShader );
```
And here’s the game code to display it:
index.html:
```html
<!DOCTYPE html>
<html>
<head>
	<script type="text/javascript" src="kiwi.js"></script>
	<script type="text/javascript" src="ripple_shader_plugin_v1.0.0.js"></script>
	<script type="text/javascript" src="demoScene.js"></script>
</head>
<body></body>
</html>
```
demoScene.js:
```js
var gameOptions = {
	renderer: Kiwi.RENDERER_AUTO
};

var demoGame = new Kiwi.Game( null, "demoGame", null, gameOptions );

var demoState = new Kiwi.State( 'demoState' );

demoState.preload = function() {
	Kiwi.State.prototype.preload.call( this );
	this.addImage( 'characterDiffuse', 'character_diffuse.png' );
	this.addImage( 'characterNormal', 'character_normal.png' );
	this.addImage( 'characterHeight', 'character_height.png' );
	this.addImage( 'background', 'ninjaBackground.png' );
};

demoState.create = function() {
	Kiwi.State.prototype.create.call( this );

	// Create multitexture
	var multiTextureImages = [
		this.textures.characterDiffuse.image,
		this.textures.characterNormal.image,
		this.textures.characterHeight.image ];
	var multiTextureAtlas = new Kiwi.Textures.MultiTextureAtlas(
		"MultiTextureAtlas",
		Kiwi.Textures.MultiTextureAtlas.SINGLE_IMAGE,
		this.textures.characterDiffuse.cells,
		multiTextureImages,
		null );

	// Add to library
	this.textureLibrary.add( multiTextureAtlas );

	// Create objects and assign rendering systems
	this.background = new Kiwi.GameObjects.StaticImage(
		this, this.textures['background'], 0, 0 );
	this.character = new Kiwi.GameObjects.StaticImage(
		this, multiTextureAtlas, 400, 150 );
	this.character.scale = 2;
	this.character.glRenderer =
		this.game.renderer.requestSharedRenderer( "RippleShaderRenderer" );

	// Add a character for contrast
	this.character2 = new Kiwi.GameObjects.StaticImage(
		this, this.textures["characterDiffuse"], 200, 150 );
	this.character2.scale = 2;

	this.addChild( this.background );
	this.addChild( this.character );
	this.addChild( this.character2 );
};

demoState.update = function() {
	Kiwi.State.prototype.update.call( this );

	// Animate ripple effect
	this.character.glRenderer.phase = this.game.idealFrame * -0.2;
};

demoGame.states.addState( demoState );
demoGame.states.switchState( 'demoState' );
```
And here’s what it looks like in action!
Download zipped source code here.
(If you cannot see the game, your browser is likely not compatible with WebGL standards. You can see a screenshot of what you’re missing out on here.)
Extending the Shader to Animation and Beyond
This shader has many flaws and shortcomings. For example, you’ll find that it may react strangely if you try to use it with sprite sheets and animation in Kiwi.js. For the purposes of demonstration, I restricted it to a square texture with no animation. You will have to do more research if you want to bring out the full power of shaders.
Most of this is due to the fact that sprite sheets use “cells”, subdivisions of a single image. The MultiTextureAtlas assumes that all its sources have the same cell layout, so if you wanted to animate our rippling shadow warrior, you’d need three sprite sheets: one for the diffuse map, and one each for the normal and height maps, each repeating the same basic layout over and over.
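To make the requirement concrete, here is a sketch of how an animated multitexture might be assembled, following the same constructor pattern as the demo scene. This is an assumption-laden illustration: the sprite sheet names are hypothetical, and it assumes MultiTextureAtlas accepts a sprite-sheet type constant analogous to the `SINGLE_IMAGE` constant used in the demo.

```javascript
// Hypothetical sketch: animating the multitextured character would
// need three sprite sheets with IDENTICAL cell layouts. Assumes the
// sheets "characterDiffuseSheet", "characterNormalSheet", and
// "characterHeightSheet" were loaded in preload(), and that
// MultiTextureAtlas supports a SPRITE_SHEET type like SINGLE_IMAGE.
var animatedImages = [
    this.textures.characterDiffuseSheet.image,
    this.textures.characterNormalSheet.image,
    this.textures.characterHeightSheet.image ];

// Because every sheet repeats the same layout, the diffuse sheet's
// cells can describe all three sources.
var animatedAtlas = new Kiwi.Textures.MultiTextureAtlas(
    "AnimatedMultiTextureAtlas",
    Kiwi.Textures.MultiTextureAtlas.SPRITE_SHEET,
    this.textures.characterDiffuseSheet.cells,
    animatedImages,
    null );

this.textureLibrary.add( animatedAtlas );
```

Note the duplication this forces: the normal and height sheets must mirror every frame of the diffuse sheet cell-for-cell, which is exactly the repetition described above.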
If you’re serious about shaders, expect to do a great deal of independent research beyond this article.
Cheat Sheet
Here’s a code template for you to fill in with your own crazy shaders:
/*
* Declare plugin
*/

Kiwi.Plugins.TemplateShader = {
    name: "TemplateShader",
    version: "0.0.0",
    minimumKiwiVersion: "1.1.0",
    pluginDependencies: []
};
Kiwi.PluginManager.register( Kiwi.Plugins.TemplateShader );

/*
* Template Shader Renderer
*/

Kiwi.Renderers.TemplateShaderRenderer = function( gl, shaderManager, params ) {

    // Perform super functionality
    Kiwi.Renderers.TextureAtlasRenderer.call( this, gl, shaderManager, params );

    // Custom functionality
    /* Put your custom code here */

    this.setShaderPair( "TemplateShader" );
};
// Extend renderer
Kiwi.extend( Kiwi.Renderers.TemplateShaderRenderer,
    Kiwi.Renderers.TextureAtlasRenderer );

Kiwi.Renderers.TemplateShaderRenderer.prototype.enable = function( gl, params ) {

    // Boilerplate extension
    Kiwi.Renderers.TextureAtlasRenderer.prototype.enable.call( this, gl, params );

    // Custom functionality
    /* Set your custom uniforms here */
};

/*
* Template Shader Pair
*/

Kiwi.Shaders.TemplateShader = function() {

    // Super call
    Kiwi.Shaders.TextureAtlasShader.call( this );

    // Extended functionality

    // Configure uniforms
    /* Declare your uniforms here */

    // Configure shaders
    /* Put your shader code here */
};
Kiwi.extend( Kiwi.Shaders.TemplateShader, Kiwi.Shaders.TextureAtlasShader );
This template doesn’t include texturing, multitexturing, or creating and assigning renderers or shaders to game objects. You’ll have to do that on your own.
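As a starting point for that last step, here is a sketch of assigning the template renderer to a game object, following the same `requestSharedRenderer` pattern used in the demo scene earlier. The state and texture names are placeholders; "TemplateShaderRenderer" matches the renderer declared in the template above.

```javascript
// Sketch: attach the custom renderer to a game object inside a
// state's create() method, mirroring the demo scene. The texture
// key "characterDiffuse" is a placeholder for your own asset.
demoState.create = function() {
    Kiwi.State.prototype.create.call( this );

    this.sprite = new Kiwi.GameObjects.StaticImage(
        this, this.textures.characterDiffuse, 0, 0 );

    // Request a shared instance of the custom renderer and assign it
    this.sprite.glRenderer = this.game.renderer.requestSharedRenderer(
        "TemplateShaderRenderer" );

    this.addChild( this.sprite );
};
```

Because the renderer is shared, every object that requests "TemplateShaderRenderer" will batch through the same shader pair, which is usually what you want for performance.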
In Review
Whew! Today we learned how to create a whole shader pipeline. We’ve covered vertex shaders, fragment shaders, uniforms, GLSL and its variable types including vecs and swizzling, custom renderers, custom shader pairs, normal maps, multitextures, and a whole lot more. I sincerely hope this was useful. We put a lot of work into making this technology available to you.
This has been the last article in the launch series for Kiwi.js v1.1.0. Thanks for sticking through to the end. You haven’t heard the last of us; Kiwi.js is always growing and improving. We’re going to add some amazing new stuff to the library in the future, and I expect I’ll be back to explain it.
In the meantime, don’t hesitate to ask about any details in the forums or via our various social media channels. Here’s a secret: I had to learn the engine myself before I started working on it, but I had the advantage of an office full of experts. I’m like a kobold scurrying in the depths of the earth, surrounded by jewels, while the people of the surface world go back and forth unaware of the riches beneath their feet. These articles were designed to bring some of those jewels into the light. If you want more, all you have to do is ask.
Benjamin D. Richards