Advanced 3D Graphics - iOS Game Development Cookbook (2014)

Chapter 10. Advanced 3D Graphics

OpenGL provides a huge amount of flexibility in how objects get rendered, and you can do all kinds of interesting things with light and material. In this chapter, we’re going to look at shaders, which are small programs that give you complete control over how OpenGL should draw your 3D scene. We’ll look at lighting, texturing, bump-mapping, and non-photorealistic rendering.

This chapter builds on beginning and intermediate concepts covered in Chapters 8 and 9, respectively.

Understanding Shaders

Problem

You want to create shader programs, which control how objects are drawn onscreen, so that you can create different kinds of materials.

Solution

A shader is composed of three elements: a vertex shader, a fragment shader, and a shader program that links the vertex and fragment shaders together. To make a shader, you first write the vertex and fragment shaders, then load them into OpenGL, and then tell OpenGL when you want to use them.

First, create the vertex shader. Create a file called SimpleVertexShader.vsh, and add it to your project:

uniform mat4 modelViewMatrix;

uniform mat4 projectionMatrix;

attribute vec3 position;

void main()

{

// "position" is in model space. We need to convert it to camera space by

// multiplying it by the modelViewProjection matrix.

gl_Position = (projectionMatrix * modelViewMatrix) * vec4(position, 1.0);

}

NOTE

When you drag and drop the file into your project, Xcode won’t add it to the list of files that get copied into the app’s resources. Instead, it will add it to the list of files that should be compiled. You don’t want this.

To fix it, open the project’s build phases by clicking the project at the top of the Project Navigator and then clicking Build Phases; drag the file out of the Compile Sources list and drop it into the Copy Bundle Resources list.

Next, create the fragment shader by creating a file called SimpleFragmentShader.fsh, and putting the following code in it:

void main()

{

// All pixels in this object will be pure red

gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0);

}

Finally, provide your shader code to OpenGL. You do this by first creating two shaders, using the glCreateShader function. You then give each shader its source code using the glShaderSource function, and then compile the shader with the glCompileShader function. This needs to be done twice, once for each of the two shaders. Note that you’ll need to keep the GLuint variable around in your Objective-C code (not the shader code):

// Keep this variable around as an instance variable

GLuint _shaderProgram;

// Compile the vertex shader

NSString* vertexSource = ... // an NSString containing the source code of your

// vertex shader. Load this string from a file.

GLuint _vertexShader = glCreateShader(GL_VERTEX_SHADER);

const char* vertexShaderSourceString =

[vertexSource cStringUsingEncoding:NSUTF8StringEncoding];

glShaderSource(_vertexShader, 1, &vertexShaderSourceString, NULL);

glCompileShader(_vertexShader);

// Compile the fragment shader

NSString* fragmentSource = ... // contains the fragment shader's source code

GLuint _fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);

const char* fragmentShaderSourceString =

[fragmentSource cStringUsingEncoding:NSUTF8StringEncoding];

glShaderSource(_fragmentShader, 1, &fragmentShaderSourceString, NULL);

glCompileShader(_fragmentShader);

After you give the shader source to OpenGL, you need to check whether there were any errors. You do this by creating a variable called success and passing its address to the glGetShaderiv function, asking for the shader’s compile status. After the function returns, success will contain 1 if compilation succeeded, and 0 if it didn’t. If a shader didn’t compile, you can find out what went wrong using the glGetShaderInfoLog function:

// Check to see if both shaders compiled

int success;

// Check the vertex shader

glGetShaderiv(_vertexShader, GL_COMPILE_STATUS, &success);

if (success == 0) {

char errorLog[1024];

glGetShaderInfoLog(_vertexShader, sizeof(errorLog), NULL, errorLog);

NSLog(@"Error: %s", errorLog);

return;

}

// Check the fragment shader

glGetShaderiv(_fragmentShader, GL_COMPILE_STATUS, &success);

if (success == 0) {

char errorLog[1024];

glGetShaderInfoLog(_fragmentShader, sizeof(errorLog), NULL, errorLog);

NSLog(@"Error: %s", errorLog);

return;

}

Once the shaders have been checked, you then create the shader program, which links the shaders together. You do this with the glCreateProgram function, and then attach the two shaders with the glAttachShader function:

_shaderProgram = glCreateProgram();

glAttachShader(_shaderProgram, _vertexShader);

glAttachShader(_shaderProgram, _fragmentShader);

Next, we need to tell OpenGL how we want to pass information to the shader. We do this by creating a variable and setting it to 1, and then instructing OpenGL that we want to refer to a specific shader variable by this number. When you have multiple variables in a shader that you want to work with this way, you create another variable and set it to 2, and so on:

const GLuint MaterialAttributePosition = 1;

// Tell OpenGL that we want to refer to the "position" variable by the number 1

glBindAttribLocation(_shaderProgram, MaterialAttributePosition, "position");

Once the program is appropriately configured, you tell OpenGL to link the program. Once it’s linked, you can check to see if there were any problems, in much the same way as when you compiled the individual shaders:

glLinkProgram(_shaderProgram);

int success;

glGetProgramiv(_shaderProgram, GL_LINK_STATUS, &success);

if (success == 0) {

char errorLog[1024];

glGetProgramInfoLog(_shaderProgram, sizeof(errorLog), NULL, errorLog);

NSLog(@"Error: %s", errorLog);

return;

}

This completes the setup process for the shader program. When you want to render using the program, you first tell OpenGL you want to use it by calling glUseProgram, and pass information to the shader using the glVertexAttribPointer function. When glDrawElements is called, the shader will be used to draw the objects:

glUseProgram(_shaderProgram);

glEnableVertexAttribArray(MaterialAttributePosition);

glVertexAttribPointer(MaterialAttributePosition, 3, GL_FLOAT, GL_FALSE,

sizeof(Vertex), (void*)offsetof(Vertex, position));

glDrawElements(GL_TRIANGLES, self.mesh.triangleCount * 3, GL_UNSIGNED_INT, 0);

Discussion

A shader program gives you a vast amount of control over how objects are rendered by OpenGL. Prior to OpenGL ES 2.0 (which became available in iOS 3 on the iPhone 3GS), the only way you could draw graphics was using the built-in functions available on the graphics chip. While these were useful, they didn’t allow programmers to create their own custom rendering effects. Shaders let you do that.

Shaders are small programs that run on the graphics chip. Vertex shaders receive the vertex information provided by your code, and are responsible for transforming the vertices from object space into screen space.

Once each vertex has been transformed, the graphics chip determines which pixels on the screen need to have color drawn into them, a process called rasterization. Once rasterization is complete, the graphics chip runs the fragment shader for each of those pixels to determine exactly what color needs to be shown.

Even though shaders appear to have very limited responsibilities, they have tremendous amounts of power. It’s up to shaders to apply effects like lighting, cartoon effects, bump mapping, and more.

Working with Materials

Problem

You want to separate the appearance of an object from its geometry.

Solution

This solution makes use of the component architecture discussed in Creating a Component-Based Game Layout and elaborated on in Chapter 9. In this solution, we’re going to create a Material class, which loads shaders and keeps material information separate from mesh information.

Create a new class called Material, which is a subclass of GLKBaseEffect. Put the following code in Material.h:

enum MaterialAttributes {

MaterialAttributePosition,

MaterialAttributeNormal,

MaterialAttributeColor,

MaterialAttributeTextureCoordinates

};

@interface Material : GLKBaseEffect <GLKNamedEffect>

+ (Material*)effectWithVertexShaderNamed:(NSString*)vertexShaderName

fragmentShaderNamed:(NSString*)fragmentShaderName error:(NSError**)error;

- (void) prepareToDraw;

@end

Now, put the following code in Material.m. We’re going to show one method at a time, because this file is kind of big. First, the instance variables. These store information about the shader, including where to find various variables. Not all of the variables will be used at the same time:

@interface Material () {

// References to the shaders and the program

GLuint _vertexShader;

GLuint _fragmentShader;

GLuint _shaderProgram;

// Uniform locations:

// Matrices, for converting points into different coordinate spaces

GLuint _modelViewMatrixLocation;

GLuint _projectionMatrixLocation;

GLuint _normalMatrixLocation;

// Textures, for getting texture info

GLuint _texture0Location;

GLuint _texture1Location;

// Light information

GLuint _lightPositionLocation;

GLuint _lightColorLocation;

GLuint _ambientLightColorLocation;

}

// Where to find the shader files

@property (strong) NSURL* vertexShaderURL;

@property (strong) NSURL* fragmentShaderURL;

@end

Next, add the methods that create the Material objects:

// Create a material by looking for a pair of named shaders

+ (Material*)effectWithVertexShaderNamed:(NSString*)vertexShaderName

fragmentShaderNamed:(NSString*)fragmentShaderName error:(NSError**)error {

NSURL* fragmentShaderURL =

[[NSBundle mainBundle] URLForResource:fragmentShaderName

withExtension:@"fsh"];

NSURL* vertexShaderURL =

[[NSBundle mainBundle] URLForResource:vertexShaderName withExtension:@"vsh"];

return [Material effectWithVertexShader:vertexShaderURL

fragmentShader:fragmentShaderURL error:error];

}

// Create a material by loading shaders from the provided URLs.

// Return nil if the shaders can't be loaded.

+ (Material*)effectWithVertexShader:(NSURL *)vertexShaderURL

fragmentShader:(NSURL *)fragmentShaderURL error:(NSError**)error {

Material* material = [[Material alloc] init];

material.vertexShaderURL = vertexShaderURL;

material.fragmentShaderURL = fragmentShaderURL;

if ([material prepareShaderProgramWithError:error] == NO)

return nil;

return material;

}

Then, add the method that loads and prepares the shaders:

// Load and prepare the shaders. Returns YES if it succeeded, or NO otherwise.

- (BOOL)prepareShaderProgramWithError:(NSError**)error {

// Load the source code for the vertex and fragment shaders

NSString* vertexShaderSource =

[NSString stringWithContentsOfURL:self.vertexShaderURL

encoding:NSUTF8StringEncoding error:error];

if (vertexShaderSource == nil)

return NO;

NSString* fragmentShaderSource =

[NSString stringWithContentsOfURL:self.fragmentShaderURL

encoding:NSUTF8StringEncoding error:error];

if (fragmentShaderSource == nil)

return NO;

// Create and compile the vertex shader

_vertexShader = glCreateShader(GL_VERTEX_SHADER);

const char* vertexShaderSourceString =

[vertexShaderSource cStringUsingEncoding:NSUTF8StringEncoding];

glShaderSource(_vertexShader, 1, &vertexShaderSourceString, NULL);

glCompileShader(_vertexShader);

if ([self shaderIsCompiled:_vertexShader error:error] == NO)

return NO;

// Create and compile the fragment shader

_fragmentShader = glCreateShader(GL_FRAGMENT_SHADER);

const char* fragmentShaderSourceString =

[fragmentShaderSource cStringUsingEncoding:NSUTF8StringEncoding];

glShaderSource(_fragmentShader, 1, &fragmentShaderSourceString, NULL);

glCompileShader(_fragmentShader);

if ([self shaderIsCompiled:_fragmentShader error:error] == NO)

return NO;

// Both of the shaders are now compiled, so we can link them together and

// form a program

_shaderProgram = glCreateProgram();

glAttachShader(_shaderProgram, _vertexShader);

glAttachShader(_shaderProgram, _fragmentShader);

// First, we tell OpenGL what index numbers we want to use to refer to

// the various attributes. This allows us to tell OpenGL about where

// to find vertex attribute data.

glBindAttribLocation(_shaderProgram, MaterialAttributePosition, "position");

glBindAttribLocation(_shaderProgram, MaterialAttributeColor, "color");

glBindAttribLocation(_shaderProgram, MaterialAttributeNormal, "normal");

glBindAttribLocation(_shaderProgram, MaterialAttributeTextureCoordinates,

"texcoords");

// Now that we've told OpenGL how we want to refer to each attribute,

// we link the program

glLinkProgram(_shaderProgram);

if ([self programIsLinked:_shaderProgram error:error] == NO)

return NO;

// Get the locations of the uniforms

_modelViewMatrixLocation =

glGetUniformLocation(_shaderProgram, "modelViewMatrix");

_projectionMatrixLocation =

glGetUniformLocation(_shaderProgram, "projectionMatrix");

_normalMatrixLocation = glGetUniformLocation(_shaderProgram, "normalMatrix");

_texture0Location = glGetUniformLocation(_shaderProgram, "texture0");

_texture1Location = glGetUniformLocation(_shaderProgram, "texture1");

_lightPositionLocation =

glGetUniformLocation(_shaderProgram, "lightPosition");

_lightColorLocation = glGetUniformLocation(_shaderProgram, "lightColor");

_ambientLightColorLocation =

glGetUniformLocation(_shaderProgram, "ambientLightColor");

return YES;

}

This method calls a pair of error-checking methods, which check to see if the shaders and program have been correctly prepared. Add them next:

// Return YES if the shader compiled correctly, NO if it didn't

// (and put an NSError in "error")

- (BOOL)shaderIsCompiled:(GLuint)shader error:(NSError**)error {

// Ask OpenGL if the shader compiled correctly

int success;

glGetShaderiv(shader, GL_COMPILE_STATUS, &success);

// If not, find out why and send back an NSError object

if (success == 0) {

if (error != nil) {

char errorLog[1024];

glGetShaderInfoLog(shader, sizeof(errorLog), NULL, errorLog);

NSString* errorString = [NSString stringWithCString:errorLog

encoding:NSUTF8StringEncoding];

*error = [NSError errorWithDomain:@"Material"

code:NSFileReadCorruptFileError userInfo:@{@"Log":errorString}];

}

return NO;

}

return YES;

}

// Return YES if the program linked successfully, NO if it didn't

// (and put an NSError in "error")

- (BOOL) programIsLinked:(GLuint)program error:(NSError**)error {

// Ask OpenGL if the program has been successfully linked

int success;

glGetProgramiv(program, GL_LINK_STATUS, &success);

// If not, find out why and send back an NSError

if (success == 0) {

if (error != nil) {

char errorLog[1024];

glGetProgramInfoLog(program, sizeof(errorLog), NULL, errorLog);

NSString* errorString = [NSString stringWithCString:errorLog

encoding:NSUTF8StringEncoding];

*error = [NSError errorWithDomain:@"Material"

code:NSFileReadCorruptFileError

userInfo:@{@"Log":errorString}];

}

return NO;

}

return YES;

}

The next step is to write the prepareToDraw method, which is called immediately before drawing takes place and tells OpenGL that the next drawing operation should use the shaders controlled by this Material:

// Called when the shader is about to be used

- (void)prepareToDraw {

// Select the program

glUseProgram(_shaderProgram);

// Give the model-view matrix to the shader

glUniformMatrix4fv(_modelViewMatrixLocation, 1, GL_FALSE,

self.transform.modelviewMatrix.m);

// Also give the projection matrix

glUniformMatrix4fv(_projectionMatrixLocation, 1, GL_FALSE,

self.transform.projectionMatrix.m);

// Provide the normal matrix to the shader, too

glUniformMatrix3fv(_normalMatrixLocation, 1, GL_FALSE,

self.transform.normalMatrix.m);

// If texture 0 is enabled, tell the shader where to find it

if (self.texture2d0.enabled) {

// "OpenGL, I'm now talking about texture 0."

glActiveTexture(GL_TEXTURE0);

// "Make texture 0 use the texture data that's referred to by

// self.texture2d0.name."

glBindTexture(GL_TEXTURE_2D, self.texture2d0.name);

// "Finally, tell the shader that the uniform variable "texture0"

// refers to texture 0.

glUniform1i(_texture0Location, 0);

}

// Likewise with texture 1

if (self.texture2d1.enabled) {

glActiveTexture(GL_TEXTURE1);

glBindTexture(GL_TEXTURE_2D, self.texture2d1.name);

glUniform1i(_texture1Location, 1);

}

// Pass light information into the shader, if it's enabled

if (self.light0.enabled) {

glUniform3fv(_lightPositionLocation, 1, self.light0.position.v);

glUniform4fv(_lightColorLocation, 1, self.light0.diffuseColor.v);

glUniform4fv(_ambientLightColorLocation, 1,

self.lightModelAmbientColor.v);

}

// With this set, fragments with an alpha of less than 1 will be

// semitransparent

glEnable(GL_BLEND);

glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

glBlendEquation(GL_FUNC_ADD);

}

Finally, add the dealloc method, which deletes the shaders and the shader program when the Material object is being freed:

// Delete the program and shaders, to free up resources

- (void)dealloc {

glDeleteProgram(_shaderProgram);

glDeleteShader(_fragmentShader);

glDeleteShader(_vertexShader);

}

To use a Material object, you first create one using the effectWithVertexShaderNamed:fragmentShaderNamed:error: method, by passing in the names of the shaders you want to use:

NSError* error = nil;

Material* material = [Material effectWithVertexShaderNamed:@"MyVertexShader"

fragmentShaderNamed:@"MyFragmentShader" error:&error];

if (material == nil) {

NSLog(@"Couldn't create the material: %@", error);

return nil;

}

When you’re about to draw using the Material, you provide vertex attributes in the same way as when you’re using a GLKBaseEffect, with a single difference—you use MaterialAttributePosition instead of GLKVertexAttribPosition, and so on for the other attributes:

[material prepareToDraw];

glEnableVertexAttribArray(MaterialAttributePosition);

glVertexAttribPointer(MaterialAttributePosition, 3, GL_FLOAT, GL_FALSE,

sizeof(Vertex), (void*)offsetof(Vertex, position));

glDrawElements(GL_TRIANGLES, self.mesh.triangleCount * 3, GL_UNSIGNED_INT, 0);

NOTE

“Material” is basically just a fancy word for a collection of properties and shaders.

Discussion

A Material object is useful for acting as a container for your shaders. As a subclass of GLKBaseEffect, your Material class is easily able to store material information like light color and where to find transforms.

The Material class presented here actually has fewer features than GLKBaseEffect, but it gives you more control. GLKBaseEffect works by dynamically creating shaders based on the parameters you supply, which means that you can’t take the base effect and add stuff on top. If you want to do more advanced rendering, you have to do it yourself—which means, among other things, writing your own shaders.

Texturing with Shaders

Problem

You want to apply textures to your objects, using shaders you’ve written.

Solution

Write a fragment shader that looks like this:

varying lowp vec4 vertex_color;

varying lowp vec2 vertex_texcoords;

uniform sampler2D texture0;

void main()

{

gl_FragColor = texture2D(texture0, vertex_texcoords) * vertex_color;

}

Discussion

A sampler2D is an object that lets you get access to texture information provided by your app. When you call the texture2D function and pass in the sampler and the texture coordinates you want to sample, you get back a four-dimensional vector that contains the red, green, blue, and alpha components at that point in the texture.

By multiplying this color with the vertex color, you can then tint the texture.

Finally, the result is then assigned to gl_FragColor, which means that OpenGL uses that color for the pixel.

Lighting a Scene

Problem

You want your objects to appear lit by light sources.

Solution

In this solution, we’ll cover point lights, which are lights that exist at a single point in space and radiate light in all directions.

To work with lights, your mesh needs to have normals. A normal is a vector that indicates the direction that a vertex is facing, which is necessary for calculating the angle at which light is going to bounce off the surface.

If you’re using the Mesh class described in Loading a Mesh, you can add normals ("nx":0, "ny":0, "nz":1 in the following example) to your mesh by adding additional info to your vertices:

{

"x":-1, "y":-1, "z":1,

"r":1, "g":0, "b":0,

"s":0, "t":1,

"nx":0, "ny":0, "nz":1

},

Next, use this vertex shader:

uniform mediump mat4 modelViewMatrix;

uniform mediump mat4 projectionMatrix;

uniform mediump mat3 normalMatrix;

attribute vec3 position;

attribute vec4 color;

attribute vec3 normal;

attribute vec2 texcoords;

varying mediump vec4 vertex_position;

varying mediump vec4 vertex_color;

varying mediump vec2 vertex_texcoords;

varying mediump vec4 vertex_normal;

void main()

{

// "position" is in model space. We need to convert it to camera space by

// multiplying it by the modelViewProjection matrix.

gl_Position = (projectionMatrix * modelViewMatrix) * vec4(position, 1.0);

// Pass the color and position of the vertex in world space to the

// fragment shader

vertex_color = color;

vertex_position = modelViewMatrix * vec4(position, 1.0);

// Also pass the normal and the texture coordinates to the fragment shader

vertex_normal = vec4(normal, 0.0);

vertex_texcoords = texcoords;

}

Finally, use this fragment shader:

uniform mediump mat4 modelViewMatrix;

uniform mediump mat3 normalMatrix;

varying mediump vec4 vertex_color;

varying mediump vec2 vertex_texcoords;

varying mediump vec4 vertex_normal;

varying mediump vec4 vertex_position;

uniform sampler2D texture0;

uniform lowp vec3 lightPosition;

uniform lowp vec4 lightColor;

uniform lowp vec4 ambientLightColor;

void main()

{

// Get the normal supplied by the vertex shader

mediump vec3 normal = vec3(normalize(vertex_normal));

// Convert the normal from object space to world space

normal = normalMatrix * normal;

// Get the position of this fragment

mediump vec3 modelViewVertex = vec3(modelViewMatrix * vertex_position);

// Determine the direction from the point on the surface to the light

mediump vec3 lightVector = normalize(lightPosition - modelViewVertex);

// Calculate how much light is reflected

mediump float diffuse = clamp(dot(normal, lightVector), 0.0, 1.0);

// Combine everything together!

gl_FragColor = texture2D(texture0, vertex_texcoords) * vertex_color *

diffuse * lightColor + ambientLightColor;

}

Discussion

To calculate how much light is bouncing off the surface and into the camera, you first need to know the direction in which the surface is oriented. This is done using normals, which are vectors that indicate the direction of the vertices that make up the surface.

Next, you need to know the angle at which light from the source strikes each point on the surface. For this you need to know where the light source is in world space, and where each point that light is bouncing off of is in world space. You determine this by having the vertex shader convert the position of each vertex into world space by multiplying the position by the model-view matrix.

Once this is done, the vertex shader passes the normal information and vertex colors to the fragment shader. The fragment shader then does the following things:

1. It ensures that the normal has length 1 by normalizing it, which is important for the following calculations.

2. It converts the normal into world space by multiplying it with the normal matrix, which has been supplied by the Material.

3. It converts the position of the fragment into world space by multiplying the vertex position, which was provided by the vertex shader, with the model-view matrix.

4. It then determines the vector that represents the light source’s position relative to the fragment’s position.

5. Once that’s done, it takes the dot product between the normal and the light vector. The result is how much light is bouncing off the surface and into the camera.

6. Finally, all of the information is combined together. The texture color, vertex color, light color, and how much light is hitting the surface are all multiplied together, and the ambient light is added.

Using Normal Mapping

Problem

You want to use normal mapping to make your objects appear to have lots of detail.

Solution

First, create a normal map: a texture that represents the bumpiness of your object. Normal maps can be made using a number of third-party tools; one that we find pretty handy is CrazyBump.

Once you have your normal map, you provide a vertex shader:

uniform mediump mat4 modelViewMatrix;

uniform mediump mat4 projectionMatrix;

uniform mediump mat3 normalMatrix;

attribute vec3 position;

attribute vec4 color;

attribute vec3 normal;

attribute vec2 texcoords;

varying mediump vec4 vertex_color;

varying mediump vec2 vertex_texcoords;

varying mediump vec4 vertex_normal;

varying mediump vec4 vertex_position;

void main()

{

// "position" is in model space. We need to convert it to camera space by

// multiplying it by the modelViewProjection matrix.

gl_Position = (projectionMatrix * modelViewMatrix) * vec4(position, 1.0);

// Next, we pass the color, position, normal, and texture coordinates

// to the fragment shader by putting them in varying variables.

vertex_color = color;

vertex_position = modelViewMatrix * vec4(position, 1.0);

vertex_normal = vec4(normal, 0.0);

vertex_texcoords = texcoords;

}

and a fragment shader:

uniform mediump mat4 modelViewMatrix;

uniform mediump mat3 normalMatrix;

varying mediump vec4 vertex_color;

varying mediump vec2 vertex_texcoords;

varying mediump vec4 vertex_normal;

varying mediump vec4 vertex_position;

uniform sampler2D texture0; // diffuse map

uniform sampler2D texture1; // normal map

uniform lowp vec3 lightPosition;

uniform lowp vec4 lightColor;

uniform lowp vec4 ambientLightColor;

void main()

{

// When normal mapping, normals don't come from the vertices, but rather

// from the normal map

mediump vec3 normal =

normalize(texture2D(texture1, vertex_texcoords).rgb * 2.0 - 1.0);

// Convert the normal from object space to world space.

normal = normalMatrix * normal;

// Get the position of this fragment.

mediump vec3 modelViewVertex = vec3(modelViewMatrix * vertex_position);

// Determine the direction from the point on the surface to the light

mediump vec3 lightVector = normalize(lightPosition - modelViewVertex);

// Calculate how much light is reflected

mediump float diffuse = clamp(dot(normal, lightVector), 0.0, 1.0);

// Combine everything together!

gl_FragColor = texture2D(texture0, vertex_texcoords) * vertex_color *

diffuse * lightColor + ambientLightColor;

}

When you create the Material using this shader, you need to provide two textures. The first is the diffuse map, which provides the base color of the surface. The second is the normal map, which is not shown to the player but is used to calculate light reflections.

Discussion

In Lighting a Scene, the normals came from the vertices. However, this generally means that there’s not a lot of detail that the light can bounce off of, and the resulting surfaces look fairly flat.

When you use a normal map, the normals come from a texture, not from individual vertices. This means that you have a lot of control over how light bounces off the surface, and consequently can make the surface appear to have more detail.

The actual algorithm for lighting a normal-mapped surface is the same as that for lighting a non-normal-mapped surface. The only difference is that the normals come from the texture, instead of being passed in by the vertex shader.

Note that normal mapping doesn’t actually make your object bumpier; it just reflects light as if it were bumpier. If you look at a normal-mapped surface side-on, it will be completely flat.

Making Objects Transparent

Problem

You want your objects to be transparent, so that objects behind them can be partly visible.

Solution

Before drawing an object, use the glBlendFunc function to control how the object you’re about to draw is blended with the scene:

glEnable(GL_BLEND);

glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

Discussion

When the fragment shader produces the color value of a pixel, that color is blended with whatever’s already been drawn. By default, the output color replaces whatever was previously drawn, but this doesn’t have to be the case.

When you call glEnable(GL_BLEND), OpenGL will blend the output color with the scene based on instructions that you provide. The specific way that the blending takes place is up to you, and you control it using the glBlendFunc function.

glBlendFunc takes two parameters. The first is how the source color is changed as part of the blend operation, and the second is how the destination color is changed. In this context, “source color” means the color that’s emitted by the fragment shader, and “destination color” means the color that was already in the scene when the drawing took place.

By default, the blending function is this:

glBlendFunc(GL_ONE, GL_ZERO);

This means that the blending works like this:

Result Color = Source Color * 1 + Destination Color * 0;

In this case, because the destination is being multiplied by zero, it contributes nothing to the result color, and the source color completely replaces it.

A common blending method in games is to use the alpha value of the color to determine transparency: that is, 0 alpha means invisible, and 1 alpha means completely opaque. To make this happen, use glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA), which has this effect:

Result Color = Source Color * Source Alpha +

Destination Color * (1 - Source Alpha)

Doing this means that the higher the alpha value for the source color is, the more it will contribute to the final color.

Another common blending mode is “additive” blending, which creates a glowing appearance. You can create this effect by calling glBlendFunc(GL_ONE, GL_ONE), which adds the two colors together:

Result Color = Source Color * 1 + Destination Color * 1

NOTE

Additive blending is referred to as “linear dodge” in graphics programs like Adobe Photoshop.

Adding Specular Highlights

Problem

You want to have shiny specular highlights on your objects.

Solution

You can use the same vertex shader as seen in Lighting a Scene. However, to get specular highlights, you need a different fragment shader:

uniform mediump mat4 modelViewMatrix;

uniform mediump mat3 normalMatrix;

varying mediump vec4 vertex_color;

varying mediump vec2 vertex_texcoords;

varying mediump vec4 vertex_normal;

varying mediump vec4 vertex_position;

uniform sampler2D texture0; // diffuse map

uniform sampler2D texture1; // normal map

uniform lowp vec3 lightPosition;

uniform lowp vec4 lightColor;

uniform lowp vec4 ambientLightColor;

void main()

{

mediump float shininess = 2.0;

// When normal mapping, normals don't come from the vertices, but rather

// from the normal map

mediump vec3 normal =

normalize(texture2D(texture1, vertex_texcoords).rgb * 2.0 - 1.0);

// Convert the normal from object space to world space

normal = normalMatrix * normal;

// Get the position of this fragment

mediump vec3 modelViewVertex = vec3(modelViewMatrix * vertex_position);

// Determine the direction from the point on the surface to the light

mediump vec3 lightVector = normalize(lightPosition - modelViewVertex);

// Calculate how much light is reflected

mediump float diffuse = clamp(dot(normal, lightVector), 0.0, 1.0);

// Determine the specular term

mediump float specular = pow(max(dot(normal, lightVector), 0.0), shininess);

// Combine everything together!

gl_FragColor = texture2D(texture0, vertex_texcoords) * vertex_color *

(diffuse * lightColor) + (lightColor * specular) + ambientLightColor;

}

Discussion

Specular highlights are bright spots that appear on very shiny objects. Specular highlights get added on top of the existing diffuse and ambient light, which make them look bright.

In the real world, no object is uniformly shiny. If you want something to look slightly old and tarnished, use specular mapping: create a texture in which white is completely shiny and black is not shiny at all. In your shader, sample this texture (in the same way as when you sample a texture for color or for normals) and multiply the result by the specular term. The result will be an object that is shiny in some places and dull in others, as shown in Figure 10-1.

Figure 10-1. Using specular highlights to create shiny objects

If you want to learn more about lighting, we recommend starting with the Wikipedia article on Phong shading.

Adding Toon Shading

Problem

You want to create a cartoon effect by making your object’s lighting look flat.

Solution

Add the following code to your fragment shader:

diffuse = ... // diffuse is calculated as in Lighting a Scene

// Group the lighting into three bands: one with diffuse lighting at 1.0,

// another at 0.75, and another at 0.5. Because there's no smooth

// transition between each band, hard lines will be visible between them.

if (diffuse > 0.75)

diffuse = 1.0;

else if (diffuse > 0.5)

diffuse = 0.75;

else

diffuse = 0.5;

// Use diffuse in the shading process as per normal.

Discussion

Toon shading is an example of non-photorealistic rendering, in that the colors that are calculated by the fragment shader are not the same as those that you would see in real life.

To create a cartoon-like effect, the colors emitted should have hard edges between regions, rather than fading smoothly from one area to another. This can be achieved in a fragment shader by reducing the possible values of lights to a small number. In this solution’s example, the diffuse light is snapped to 0.5, 0.75, or 1.0. As a result, the final rendered color has hard, bold edges (see Figure 10-2).

Figure 10-2. Toon shading