Bump Map Generation

Bump mapping is a texture-based technique that improves the lighting model of a 3D renderer. I’m a big fan of bump mapping; I think it’s a great way to make a renderer’s graphics really pop at no additional geometry-processing cost.

Bump mapping example. Notice the improved illusion of depth generated by the technique. Image taken from http://3dmodeling4business.com

Much has been written about this technique, as it’s widely used in lots of popular games. The basic idea is to perturb normals used for lighting at the per-pixel level, in order to provide additional shading cues to the eye.

The beauty of this technique is that it doesn’t require any additional geometry for the model, just a new texture map containing the perturbed normals.

This post covers the topic of bump map generation, taking as input nothing but a diffuse texture. It is based on the techniques described in the books “More OpenGL Game Programming” by Dave Astle and “Mathematics for 3D Game Programming and Computer Graphics” by Eric Lengyel.

Let’s get started! Here’s the Imp texture that I normally use in my examples. You might remember the Imp from my Shadow Mapping on iPad post.

Diffuse texture map of the Imp model.

The idea is to generate the bump map from this texture. To do so, we will analyze the diffuse map as if it were a heightmap describing a surface. Under this assumption, the bump map will be composed of the surface normals at each point (pixel).

So, the question is, how do we obtain a heightmap from the diffuse texture? We will cheat. We will convert the image to grayscale and hope for the best. At least this way we will be taking into account the contribution of each color channel for each pixel we process.

Let’s call H the heightmap and D the diffuse map. Converting an image to grayscale can easily be done programmatically, weighting each channel by the standard luminance coefficients:

  \forall (i,j) \in [0..width(D)] \times [0..height(D)], \quad H_{i,j} = red(D_{i,j}) \cdot 0.30 + green(D_{i,j}) \cdot 0.59 + blue(D_{i,j}) \cdot 0.11

 
As we apply this formula to every pixel, we obtain a grayscale image (our heightmap), shown in the next figure:

A grayscale conversion of the Imp diffuse texture.
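As a rough sketch of how this step could be implemented (illustrative only; it uses NumPy and Pillow, and the function name to_heightmap is made up for this example, not part of any real tool):

    import numpy as np
    from PIL import Image

    def to_heightmap(diffuse_path):
        # Load the diffuse map as an RGB array of floats in [0, 1].
        rgb = np.asarray(Image.open(diffuse_path).convert("RGB"), dtype=np.float32) / 255.0
        # Weighted sum of the channels using the standard luminance weights.
        return rgb[..., 0] * 0.30 + rgb[..., 1] * 0.59 + rgb[..., 2] * 0.11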

Now that we have our heightmap, we will study how the grayscale values vary in the horizontal (s) and vertical (t) directions. This is a very rough approximation of the surface derivatives at each point and will allow us to approximate the normal later.

If H_{i,j} is the grayscale value stored in the heightmap at point (i,j), then we approximate the derivatives along s and t like so:

  s_{i,j} = (1, 0, H_{i+1,j}-H_{i-1,j})  \\  t_{i,j} = (0, 1, H_{i, j+1}-H_{i,j-1})

 

s and t are two vectors tangent to the heightmap surface at point (i,j). What we can now do is take their cross product to find a vector perpendicular to both. This vector will be the normal of the surface at point (i,j) and is, therefore, the vector we were looking for. We will store it in the bump map texture.

  N = \frac{s \times t}{||s \times t||}

 

After applying this logic to the entire heightmap, we obtain our bump map.
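Here’s a rough, vectorized sketch of that loop (again illustrative, not the actual tool): it assumes the heightmap wraps around at the borders, and the optional scale factor simply exaggerates or attenuates the derivatives.

    import numpy as np

    def heightmap_to_normals(height, scale=1.0):
        h = np.asarray(height, dtype=np.float32)
        # Central differences. Rows are the vertical (t) axis, columns the horizontal (s) axis.
        # np.roll wraps around at the borders, which is fine for tileable textures.
        dx = np.roll(h, -1, axis=1) - np.roll(h, 1, axis=1)  # H[i+1,j] - H[i-1,j]
        dy = np.roll(h, -1, axis=0) - np.roll(h, 1, axis=0)  # H[i,j+1] - H[i,j-1]
        # s = (1, 0, dx) and t = (0, 1, dy), so s x t = (-dx, -dy, 1).
        n = np.dstack((-scale * dx, -scale * dy, np.ones_like(h)))
        # Normalize every per-pixel vector.
        return n / np.linalg.norm(n, axis=2, keepdims=True)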

We must be careful when storing a normalized vector in a texture. Vector components will be in the [-1, 1] range, but the values we can store in a bitmap need to be in the [0, 255] range, so we will have to map between the two ranges to store our data as color.
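One simple linear mapping is to remap each component x in [-1, 1] to (x + 1)/2 * 255 and round to the nearest integer. As a sketch, continuing from the example functions above (the file names here are just placeholders):

    import numpy as np
    from PIL import Image

    normals = heightmap_to_normals(to_heightmap("imp_diffuse.png"))
    # Remap each component from [-1, 1] to [0, 255] and store as 8-bit RGB.
    packed = np.rint((normals + 1.0) * 0.5 * 255.0).astype(np.uint8)
    Image.fromarray(packed, mode="RGB").save("imp_bump.png")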

A linear conversion produces an image like the following:

Bump map generated from the Imp’s diffuse map, ready to be fed into the video card.

Notice the prominence of blue, which represents normals close to the (unperturbed) (0,0,1) vector. Vertical normals end up being stored as bluish colors after the linear conversion.

We are a bit more interested in the darker areas, however. These are the places where the normals are more perturbed; they make the Phong equation subtly affect the shading, expressing “discontinuities” in the surface that the eye will interpret as “wrinkles”.

Other colors will end up looking like slopes and/or curves.

In all fairness, the image is a bit grainier than I would’ve liked. We can apply a bilinear filter to it to make it smoother. We could also apply a scale to the s and t vectors to control how steep the calculated normals will be.

However, since we are going to be interpolating rotated vectors during the rasterization process, these images will be good enough for now.

I’ve written a short Python script that implements this logic and applies it to any diffuse map. It is now part of the Vortex Engine toolset.

In my next post I’m going to discuss how to implement the vertex and fragment shaders necessary to apply bump mapping on a trivial surface. Stay tuned!