Friday, April 13, 2012

Part 10: Bump Mapping with CrazyBump


In this post, I'm going to use normal mapping to "fake" details on the cube. I will also add specularity, which will make those details pop out. There are a few software products out there that generate normal maps automatically by analyzing a texture.

As it turns out, CrazyBump is a pretty neat one, and as of this writing, the Mac version is in public beta. When it's no longer in beta, well, you can either find another tool, or, ahem... do what you have to do.

As usual, here's a link to what I've done so far: Project

I've taken the time to add a few more comments and fix some things here and there.

First things first, you're going to want to install CrazyBump. When that's done, run the program, click Open and select "Open photograph from file". Then browse to wherever the texture we've been using since the beginning is, and open it. CrazyBump can't open PVR files, so you need to find the original PNG texture. After a few seconds, you'll be presented with a choice between two shapes.


The dark regions represent bumps that are the deepest, so choose the right image, where the cracks are dark.

After you make your choice, you'll be presented with a 3D preview of what the normal map can do. In the preview panel, you can change the shape to a cube, so you'll have a pretty good idea of what it will look like on our cube.

In the main panel, there are a ton of sliders you can use to customize the normal map, and I encourage you to play around with them and watch the result in the preview window, because... well... it's super fun.

When you're done fooling around, click save normals in the main window, and save the texture as SquareTexture_NRM.png, which will be the default name. When you open the normal map, it should look something like this:


This, right there, is the normal vector information, encoded as RGB colors. What we're going to do is load this file with GLKTextureLoader, then sample it in the fragment shader to do per-fragment lighting calculations, instead of the per-vertex calculations we did before.

But first... I want you to compress this file exactly like we compressed the texture, because I don't want to load an uncompressed 512x512 PNG file in my program. When that's done, add SquareTexture_NRM.pvr to the project.
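If you need a reminder of how to do the compression, Apple's texturetool (it ships with the iOS SDK) can do it from the command line. Something along these lines should work, assuming the same 4-bits-per-pixel PVRTC settings we used for the diffuse texture; adjust the options to match whatever you used back then:

texturetool -e PVRTC -f PVR --bits-per-pixel-4 -o SquareTexture_NRM.pvr SquareTexture_NRM.png

The -f PVR option wraps the compressed data in a PVR container, which is the format GLKTextureLoader knows how to read, and PVRTC needs square power-of-two images, so our 512x512 map is fine.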

Now you need to load that file in the program. To do that, you'll need a new GLKTextureInfo property which you will add to your private interface:

@property (nonatomic, strong) GLKTextureInfo *normalMap;

Don't forget to synthesize your property...
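If you're following the same pattern as the texture property, the synthesize line is a one-liner (I'm assuming the usual underscore-prefixed backing ivar here):

@synthesize normalMap = _normalMap;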

You load the normal map exactly like a texture, this way:

filePath = [[NSBundle mainBundle] pathForResource:@"SquareTexture_NRM" ofType:@"pvr"];
self.normalMap = [GLKTextureLoader textureWithContentsOfFile:filePath options:nil error:&error];
if (error) {
    NSLog(@"Error loading normal map from image: %@", error);
}

Now, when you do lighting calculations in your fragment shader, you're going to do them in tangent space, simply because it's easier that way. In tangent space, you have three vectors forming the basis: the normal (red), which we already have, plus the tangent (green) and the binormal (blue), which we don't have.


If you have two of them, you can calculate the third, since it's orthogonal to the other two. It's also possible to calculate the tangent from the surrounding vertices, and that's usually what I've seen being done (there's a sketch of that approach right after the vertex data below). But since I'm drawing a cube and the tangent values can be deduced easily, I'm just going to add them to the vertex data array. This means I will have a tangent value for each vertex.

GLfloat CubeVertexData[264] =
{
// right 0
0.5f, -0.5f, -0.5f, 1.0f, 0.0f, 0.0f, 1.0f, 0.0f, 0.0f, 0.0f, -1.0f,
0.5f, 0.5f, -0.5f, 1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 0.0f, 0.0f, -1.0f,
0.5f, 0.5f, 0.5f, 1.0f, 0.0f, 0.0f, 0.0f, 1.0f, 0.0f, 0.0f, -1.0f,
0.5f, -0.5f, 0.5f, 1.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, -1.0f,
// top 4
0.5f, 0.5f, -0.5f, 0.0f, 1.0f, 0.0f, 1.0f, 1.0f, 1.0f, 0.0f, 0.0f,
-0.5f, 0.5f, -0.5f, 0.0f, 1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 0.0f, 0.0f,
-0.5f, 0.5f, 0.5f, 0.0f, 1.0f, 0.0f, 0.0f, 0.0f, 1.0f, 0.0f, 0.0f,
0.5f, 0.5f, 0.5f, 0.0f, 1.0f, 0.0f, 1.0f, 0.0f, 1.0f, 0.0f, 0.0f,
// left 8
-0.5f, 0.5f, -0.5f, -1.0f, 0.0f, 0.0f, 0.0f, 1.0f, 0.0f, 0.0f, 1.0f,
-0.5f, -0.5f, -0.5f, -1.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 1.0f,
-0.5f, -0.5f, 0.5f, -1.0f, 0.0f, 0.0f, 1.0f, 0.0f, 0.0f, 0.0f, 1.0f,
-0.5f, 0.5f, 0.5f, -1.0f, 0.0f, 0.0f, 1.0f, 1.0f, 0.0f, 0.0f, 1.0f,
// bottom 12
-0.5f, -0.5f, -0.5f, 0.0f, -1.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f,
0.5f, -0.5f, -0.5f, 0.0f, -1.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f,
0.5f, -0.5f, 0.5f, 0.0f, -1.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f,
-0.5f, -0.5f, 0.5f, 0.0f, -1.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f, 0.0f,
// front 16
0.5f, 0.5f, 0.5f, 0.0f, 0.0f, 1.0f, 1.0f, 1.0f, 1.0f, 0.0f, 0.0f,
-0.5f, 0.5f, 0.5f, 0.0f, 0.0f, 1.0f, 0.0f, 1.0f, 1.0f, 0.0f, 0.0f,
-0.5f, -0.5f, 0.5f, 0.0f, 0.0f, 1.0f, 0.0f, 0.0f, 1.0f, 0.0f, 0.0f,
0.5f, -0.5f, 0.5f, 0.0f, 0.0f, 1.0f, 1.0f, 0.0f, 1.0f, 0.0f, 0.0f,
// back 20
0.5f, 0.5f, -0.5f, 0.0f, 0.0f, -1.0f, 0.0f, 1.0f, -1.0f, 0.0f, 0.0f,
0.5f, -0.5f, -0.5f, 0.0f, 0.0f, -1.0f, 0.0f, 0.0f, -1.0f, 0.0f, 0.0f,
-0.5f, -0.5f, -0.5f, 0.0f, 0.0f, -1.0f, 1.0f, 0.0f, -1.0f, 0.0f, 0.0f,
-0.5f, 0.5f, -0.5f, 0.0f, 0.0f, -1.0f, 1.0f, 1.0f, -1.0f, 0.0f, 0.0f,
};
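Here's the minimal sketch of the per-triangle approach I mentioned above, for when the model isn't a cube and the tangents can't be hard-coded. The function name TangentForTriangle is just something I made up for illustration, and on a real mesh you would average the result over every triangle that shares a vertex (and re-orthogonalize against the normal):

#import <GLKit/GLKMath.h>

// Computes a tangent for one triangle from its positions and texture coordinates
// by solving for the direction in which the U texture coordinate increases.
static GLKVector3 TangentForTriangle(GLKVector3 p0, GLKVector3 p1, GLKVector3 p2,
                                     GLKVector2 uv0, GLKVector2 uv1, GLKVector2 uv2)
{
    GLKVector3 edge1 = GLKVector3Subtract(p1, p0);
    GLKVector3 edge2 = GLKVector3Subtract(p2, p0);
    GLKVector2 deltaUV1 = GLKVector2Subtract(uv1, uv0);
    GLKVector2 deltaUV2 = GLKVector2Subtract(uv2, uv0);

    // Inverse of the 2x2 UV determinant (assumes the triangle isn't degenerate in UV space)
    float r = 1.0f / (deltaUV1.x * deltaUV2.y - deltaUV2.x * deltaUV1.y);

    // T = r * (dV2 * edge1 - dV1 * edge2)
    GLKVector3 tangent = GLKVector3Subtract(GLKVector3MultiplyScalar(edge1, deltaUV2.y),
                                            GLKVector3MultiplyScalar(edge2, deltaUV1.y));
    return GLKVector3Normalize(GLKVector3MultiplyScalar(tangent, r));
}

The normalize at the end throws away the magnitude of r, but multiplying by r first preserves its sign, which matters when the texture coordinates are mirrored.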

If you want to read more about the tangent space, I suggest you follow this link:


Since I've added more data, I'm going to change the stride again and add a new attribute pointer, as well as update the other attribute pointers to reflect the new stride. This is all done in the viewDidLoad method.

// viewDidLoad method
// Update the stride
GLsizei stride = sizeof(GLfloat) * 11;
// Update the existing attribute pointers with the new stride
attribute = glGetAttribLocation(_program, "VertexPosition");
glEnableVertexAttribArray(attribute);
glVertexAttribPointer(attribute, 3, GL_FLOAT, GL_FALSE, stride, NULL);
attribute = glGetAttribLocation(_program, "VertexNormal");
glEnableVertexAttribArray(attribute);
glVertexAttribPointer(attribute, 3, GL_FLOAT, GL_FALSE, stride, BUFFER_OFFSET(stride*3/11));
attribute = glGetAttribLocation(_program, "VertexTexCoord0");
glEnableVertexAttribArray(attribute);
glVertexAttribPointer(attribute, 2, GL_FLOAT, GL_FALSE, stride, BUFFER_OFFSET(stride*6/11));
// Add the new attribute
attribute = glGetAttribLocation(_program, "VertexTangent0");
glEnableVertexAttribArray(attribute);
glVertexAttribPointer(attribute, 3, GL_FLOAT, GL_FALSE, stride, BUFFER_OFFSET(stride*8/11));
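If the BUFFER_OFFSET arithmetic looks odd, remember that stride is sizeof(GLfloat) * 11, so stride*3/11 is just sizeof(GLfloat)*3 (skip the 3 position floats), stride*6/11 is sizeof(GLfloat)*6 (skip position + normal), and stride*8/11 is sizeof(GLfloat)*8 (skip position + normal + texture coordinates). You could write the tangent pointer more directly like this, which does exactly the same thing:

glVertexAttribPointer(attribute, 3, GL_FLOAT, GL_FALSE, stride, BUFFER_OFFSET(sizeof(GLfloat) * 8));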

I've also modified the draw loop a bit: since we now have two textures, we need to bind them to two different texture units and tell each sampler which unit to read from. In the glkView:drawInRect: method:

else if (!strcmp(_uniformArray.Uniform[i].Name, "TextureSampler"))
{
    glActiveTexture(GL_TEXTURE0);
    glBindTexture(self.texture.target, self.texture.name);
    glUniform1i(_uniformArray.Uniform[i].Location, 0);
}
else if (!strcmp(_uniformArray.Uniform[i].Name, "BumpSampler"))
{
    glActiveTexture(GL_TEXTURE1);
    glBindTexture(self.normalMap.target, self.normalMap.name);
    glUniform1i(_uniformArray.Uniform[i].Location, 1);
}


Now, onto the shader code, starting with the vertex shader:

uniform mediump mat4 ModelViewMatrix;
uniform mediump mat4 ProjectionMatrix;
uniform mediump vec3 LightPosition;
uniform lowp mat3 NormalMatrix;

attribute mediump vec3 VertexPosition;
attribute lowp vec2 VertexTexCoord0;
attribute lowp vec3 VertexNormal;
attribute lowp vec3 VertexTangent0;

varying lowp vec3 PositionTS;
varying lowp vec3 LightDirectionTS;
varying lowp vec2 FragmentTexCoord0;

void main(void)
{
    // New temp variable
    mediump vec3 tmp;

    // Calculate the normal, tangent, binormal and position in eye coordinates
    lowp vec3 normal = NormalMatrix * VertexNormal;
    lowp vec3 tangent = NormalMatrix * VertexTangent0;
    lowp vec3 binormal = cross(normal, tangent);
    PositionTS = vec3(ModelViewMatrix * vec4(VertexPosition, 1.0));

    // Calculate the position in clip coordinates
    gl_Position = ProjectionMatrix * vec4(PositionTS, 1.0);

    // Calculate the light direction in eye coordinates
    lowp vec3 LightDirectionES = normalize(LightPosition - PositionTS);

    // Calculate the light direction in tangent coordinates
    LightDirectionTS.x = dot(LightDirectionES, tangent);
    LightDirectionTS.y = dot(LightDirectionES, binormal);
    LightDirectionTS.z = dot(LightDirectionES, normal);

    // Calculate the position in tangent coordinates
    tmp.x = dot(PositionTS, tangent);
    tmp.y = dot(PositionTS, binormal);
    tmp.z = dot(PositionTS, normal);
    PositionTS = normalize(tmp);

    // Pass the texture coordinates to the fragment shader
    FragmentTexCoord0 = VertexTexCoord0;
}


What's new:
  • A new attribute for the tangent: VertexTangent0
  • A new varying for the position in tangent space: PositionTS (which also doubles as a temporary variable for the eye-space position)
  • A new varying for the light direction in tangent space: LightDirectionTS
  • No more LightColor: the lighting will now be calculated in the fragment shader using the normal map
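In case those three dot products in the vertex shader look arbitrary: they're the same thing as multiplying LightDirectionES by the transpose of the 3x3 matrix whose columns are the tangent, binormal and normal (the TBN matrix). As long as the model-view matrix is just rotation and translation (no scaling), those three vectors stay orthonormal, the transpose acts as the inverse, and this is exactly the change of basis that takes a vector from eye space into tangent space.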
And now the fragment shader:

uniform sampler2D TextureSampler;
uniform sampler2D BumpSampler;

varying lowp vec3 PositionTS;
varying lowp vec3 LightDirectionTS;
varying lowp vec2 FragmentTexCoord0;

void main(void)
{
    // Get the normal by sampling the normal map and converting the color to a vector
    lowp vec3 normal = texture2D(BumpSampler, FragmentTexCoord0).rgb * 2.0 - 1.0;

    // Calculate the diffuse light intensity
    lowp float intensity = max(dot(LightDirectionTS, normal), 0.0);

    // Start with a minimum value which represents the ambient light
    gl_FragColor = vec4(0.1);

    // Calculate the reflection vector for the specular light
    lowp vec3 reflectionVector = normalize(reflect(LightDirectionTS, normal));

    if (intensity > 0.0)
    {
        // Add the diffuse light
        gl_FragColor += texture2D(TextureSampler, FragmentTexCoord0) * vec4(1.0) * intensity;

        // Add the specular light
        gl_FragColor += vec4(1.0) * pow(max(dot(reflectionVector, PositionTS), 0.0), 15.0);
    }
}


What's new:

  • BumpSampler to sample the normal map
  • The three varying variables passed from the vertex shader
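Two details worth spelling out: the texture2D sample comes back in the [0, 1] range, so the * 2.0 - 1.0 remaps each channel to [-1, 1], which is the range an actual normal vector needs; and the 15.0 exponent in the pow() call is the shininess, so raising it gives a smaller, tighter specular highlight.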
And here's the result:


Here's the final project with bump mapping: Project

2 comments:

todd412 said...

This site is great! Extremely helpful perspectives on a very difficult subject. Esp. with regards to GLKit and its strengths and weaknesses. Kudos.

If you are planning to add more, might I suggest a demonstration of the use of multiple objects (with different shaders perhaps) and the use of shadows. Perhaps one object that casts a shadow on a second object?

Gabriel Gohier-Roy said...

Thanks for the comment!

I always wanted to do more, but what I lack with my new job is time.

If I ever get back to writing more articles, there were a couple of things I had already planned, some of them along the lines of what you suggested.

1) Do an update on GLKit with iOS 6. I haven't checked yet, but I'm hoping they added more features with the new version of iOS.

2) Putting together a scene (Multiple objects)

3) Uber shaders (Multiple shaders)

4) Simple effects (Possibly shadows like you suggested)

This blog was a pet project during my last year in college and also during the 3-month break I took right after. Hopefully I will find the time to write some more!