Generating a normal map from a height map?

Solution 1

Example GLSL code from my water surface rendering shader:

#version 130
uniform sampler2D unit_wave;
noperspective in vec2 tex_coord;
const vec2 size = vec2(2.0,0.0);   // size.x = 2: the two sampled texels are two texels apart
const ivec3 off = ivec3(-1,0,1);   // texel offsets for the four neighbours

    vec4 wave = texture(unit_wave, tex_coord);
    float s11 = wave.x;
    // heights of the left, right, lower and upper neighbours
    float s01 = textureOffset(unit_wave, tex_coord, off.xy).x;
    float s21 = textureOffset(unit_wave, tex_coord, off.zy).x;
    float s10 = textureOffset(unit_wave, tex_coord, off.yx).x;
    float s12 = textureOffset(unit_wave, tex_coord, off.yz).x;
    // tangent vectors along x and y; their cross product is the normal
    vec3 va = normalize(vec3(size.xy,s21-s01));
    vec3 vb = normalize(vec3(size.yx,s12-s10));
    vec4 bump = vec4( cross(va,vb), s11 );

The result is a bump vector: xyz=normal, a=height
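Note that the normal components come out in the [-1, 1] range; if the result is to be written into an 8-bit normal-map texture rather than used directly, it needs remapping to [0, 1] first (see jozxyqk's comment below). A minimal sketch, assuming a standard fragment output:

out vec4 frag_color;  // hypothetical fragment output

// remap xyz from [-1, 1] to [0, 1] for storage; keep the height in alpha
frag_color = vec4(bump.xyz * 0.5 + 0.5, bump.a);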

Solution 2

"My thinking is that each pixel of the image represents a lattice coordinate in a 256 x 256 grid (hence, why there are 257 x 257 heights). That would mean that the normal at coordinate (i, j) is determined by the heights at (i, j), (i, j + 1), (i + 1, j), and (i + 1, j + 1) (call those A, B, C, and D, respectively)."

No. Each pixel of the image represents a vertex of the grid, so intuitively, from symmetry, its normal is determined by heights of neighboring pixels (i-1,j), (i+1,j), (i,j-1), (i,j+1).

Given a function f : ℝ² → ℝ that describes a surface in ℝ³, two tangent vectors of that surface at (x,y) are (1, 0, ∂f/∂x) and (0, 1, ∂f/∂y); their cross product points along the normal, so a unit normal at (x,y) is given by

v = (−∂f/∂x, −∂f/∂y, 1) and n = v/|v|.

It can be proven that the best approximation to ∂f/∂x by two samples is achieved by:

∂f/∂x(x,y) = (f(x+ε,y) − f(x−ε,y))/(2ε)

To get a better approximation you would need at least four samples, so adding a third point (i.e. f(x,y) itself) doesn't improve the result.

Your heightmap is a sampling of some function f on a regular grid. Taking ε = 1 you get:

2v = (f(x−1,y) − f(x+1,y), f(x,y−1) − f(x,y+1), 2)

Putting it into code would look like:

// sample the height map (f(x,y) reads the height at texel (x,y)):
float fx0 = f(x-1,y), fx1 = f(x+1,y);
float fy0 = f(x,y-1), fy1 = f(x,y+1);

// the spacing of the grid, in the same units as the heights
float eps = ... ;

// plug into the formulae above:
vec3 n = normalize(vec3((fx0 - fx1)/(2*eps), (fy0 - fy1)/(2*eps), 1));
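As a more concrete sketch of the same formula in GLSL, assuming the heights live in a single-channel texture (the height_map name and the use of texelFetch are illustrative, not part of the answer above):

uniform sampler2D height_map;  // hypothetical heightmap texture

// central-difference normal at integer texel p; eps is the grid spacing
// in the same units as the stored heights
vec3 height_normal(ivec2 p, float eps) {
    float fx0 = texelFetch(height_map, p + ivec2(-1, 0), 0).x;
    float fx1 = texelFetch(height_map, p + ivec2( 1, 0), 0).x;
    float fy0 = texelFetch(height_map, p + ivec2(0, -1), 0).x;
    float fy1 = texelFetch(height_map, p + ivec2(0,  1), 0).x;
    return normalize(vec3((fx0 - fx1) / (2.0 * eps),
                          (fy0 - fy1) / (2.0 * eps),
                          1.0));
}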

Solution 3

A common method is using a Sobel filter for a weighted/smooth derivative in each direction.

Start by sampling a 3x3 area of heights around each texel (here, [4] is the pixel we want the normal for).

[6][7][8]
[3][4][5]
[0][1][2]

Then,

// float s[9] contains the samples above
vec3 n;
// Sobel derivatives in x and y, negated as in the normal formula
n.x = scale * -(s[2]-s[0] + 2.0*(s[5]-s[3]) + s[8]-s[6]);
n.y = scale * -(s[6]-s[0] + 2.0*(s[7]-s[1]) + s[8]-s[2]);
n.z = 1.0;
n = normalize(n);

where scale can be adjusted to match the heightmap's real-world depth relative to its size.
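A sketch of the sampling step, assuming a sampler2D height_map and integer texel coordinates (both names illustrative); texelFetch is used rather than textureOffset because the offset argument of textureOffset must be a compile-time constant:

uniform sampler2D height_map;  // hypothetical heightmap texture

float s[9];
ivec2 p = ivec2(gl_FragCoord.xy);   // texel we want the normal for
for (int j = 0; j < 3; ++j)         // j = 0 is the bottom row, as in the diagram
    for (int i = 0; i < 3; ++i)
        s[3*j + i] = texelFetch(height_map, p + ivec2(i - 1, j - 1), 0).x;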

Solution 4

If you think of each pixel as a vertex rather than a face, you can generate a simple triangular mesh.

+--+--+
|\ |\ |
| \| \|
+--+--+
|\ |\ |
| \| \|
+--+--+

Each vertex has an x and y coordinate corresponding to the x and y of the pixel in the map. The z coordinate is based on the value in the map at that location. Triangles can be generated explicitly or implicitly by their position in the grid.
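For the implicit case, the triangle indices can be computed straight from the grid position; a sketch, assuming row-major vertex order in a grid w vertices wide (the idx helper is illustrative):

// row-major index of the vertex at column i, row j
int idx(int i, int j, int w) { return j*w + i; }

// one consistent way to split the cell at (i, j) into two triangles:
//   triangle 1: idx(i, j), idx(i+1, j),   idx(i+1, j+1)
//   triangle 2: idx(i, j), idx(i+1, j+1), idx(i, j+1)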

What you need is the normal at each vertex.

A vertex normal can be computed by taking an area-weighted average of the surface normals for each of the triangles that meet at that point.

If you have a triangle with vertices v0, v1, v2, then you can use a vector cross product (of two vectors that lie on two of the sides of the triangle) to compute a vector in the direction of the normal and scaled proportionally to the area of the triangle.

Vector3 contribution = Cross(v1 - v0, v2 - v1);

Each of your vertices that aren't on the edge will be shared by six triangles. You can loop through those triangles, summing up the contributions, and then normalize the vector sum.

Note: You have to compute the cross products in a consistent way to make sure the normals are all pointing in the same direction. Always pick two sides in the same order (clockwise or counterclockwise). If you mix some of them up, those contributions will be pointing in the opposite direction.

For vertices on the edge, you end up with a shorter loop and a lot of special cases. It's probably easier to create a border around your grid of fake vertices and then compute the normals for the interior ones and discard the fake borders.

for each interior vertex V {
  Vector3 sum(0.0, 0.0, 0.0);
  for each of the six triangles T that share V {
    const Vector3 side1 = T.v1 - T.v0;
    const Vector3 side2 = T.v2 - T.v1;
    const Vector3 contribution = Cross(side1, side2);
    sum += contribution;
  }
  sum.Normalize();
  V.normal = sum;
}

If you need the normal at a particular point on a triangle (other than one of the vertices), you can interpolate by weighing the normals of the three vertices by the barycentric coordinates of your point. This is how graphics rasterizers treat the normal for shading. It allows a triangle mesh to appear like smooth, curved surface rather than a bunch of adjacent flat triangles.
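In code, with b holding the barycentric coordinates of the point and n0, n1, n2 the three vertex normals (a sketch; the names are illustrative):

// weigh the vertex normals by the barycentric coordinates, then renormalize
vec3 shading_normal(vec3 b, vec3 n0, vec3 n1, vec3 n2) {
    return normalize(b.x*n0 + b.y*n1 + b.z*n2);
}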

Tip: For your first test, use a perfectly flat grid and make sure all of the computed normals are pointing straight up.

Comments

  • Dawson
    Dawson almost 2 years

    I'm working on procedurally generating patches of dirt using randomized fractals for a video game. I've already generated a height map using the midpoint displacement algorithm and saved it to a texture. I have some ideas for how to turn that into a texture of normals, but some feedback would be much appreciated.

    My height texture is currently a 257 x 257 gray-scale image (height values are scaled for visibility purposes).

    My thinking is that each pixel of the image represents a lattice coordinate in a 256 x 256 grid (hence, why there are 257 x 257 heights). That would mean that the normal at coordinate (i, j) is determined by the heights at (i, j), (i, j + 1), (i + 1, j), and (i + 1, j + 1) (call those A, B, C, and D, respectively).

    So given the 3D coordinates of A, B, C, and D, would it make sense to:

    1. split the four into two triangles: ABC and BCD
    2. calculate the normals of those two faces via cross product
    3. split into two triangles: ACD and ABD
    4. calculate the normals of those two faces
    5. average the four normals

    ...or is there a much easier method that I'm missing?

  • Marnix
    Marnix about 13 years
    @Toolbox: This is indeed a very nice way to do this. You don't need to calculate your normals on the CPU; just pass the heightmap to the GPU with GLSL. This method of using the heightmap for normals is called bump mapping, I believe. So I vote for @kvark! It has nothing to do with vertices and splitting things up into triangles.
  • Dawson
    Dawson about 13 years
    @Adrian McCarthy - Unfortunately, I don't think I can afford to generate an actual mesh, particularly since it's just for a flat stretch of dirt. But thank you for the explanation anyway!
  • sinisterchipmunk
    sinisterchipmunk over 12 years
    This is a great solution, but I had to make some changes: vec3 va = normalize(vec3(size.x, s21-s01, size.y)); vec3 vb = normalize(vec3(size.y, s12-s10, -size.x)); While switching Y and Z are no big deal, I thought it was interesting that I had to subtract s21-s01 instead of s21-s11 as the example indicates. I also had to negate size.x in vb.
  • ds-bos-msk
    ds-bos-msk over 10 years
    Just out of curiosity, in your experience, is it more efficient to just calculate bump-mapping from a prepared normal map, or to get the normals on the fly from a heightmap in the fragment shader? I wouldn't be surprised if this way would be faster because you're reading 4 times less texture data than you would with a normal map, and you don't even need tangents or binormals as interpolants or as vertex attributes. On the other hand this method is heavier on the ALUs and SFs in the fragment stage...
  • kvark
    kvark over 10 years
    @bigD passing tangents/bitangents has nothing to do with deriving normals from the heightmap - it's a matter of height/normal map interpretation. A height map is used for water because it's the output of some simulation algorithm. For other cases a regular normal map is more efficient.
  • ds-bos-msk
    ds-bos-msk over 10 years
    @kvark Yes, sorry, you're right. You would normally still have to pass tangents/binormals, or a quaternion or something. You could use the GLSL built-in derivative functions to derive orientation information from texture coordinates (for either normal maps or height maps), but that might actually have some serious view-dependent error/inaccuracy...
  • Ash
    Ash almost 8 years
    Thanks for this. As a matter of note for people in my position, I was implementing this as a Core Image filter for MacOS and kept getting the strangest results - they were different every time I ran the filter with the same image. Turns out it was something that some might consider a newbie mistake - I'm new to the kernel. It didn't like the number literals in the formulas. I made a constant const float d = 2.0 and substituted that for the 2 in the calculations and bam, it worked beautifully.
  • j00hi
    j00hi over 6 years
    I don't quite get why the vectors va and vb have to be calculated using 2.0 and 0.0. Why do we need exactly the numbers 2 and 0 here, resulting in the vectors (2, 0, s21-s01) and (0, 2, s12-s10)? Can someone explain this mathematically?
  • kvark
    kvark over 6 years
    @j00hi Because the height difference we put into Z is between the texels on either side of the current one, the distance between those texels is 2, and we are interested in the height change per texel.
  • Richard
    Richard over 4 years
    This algorithm makes a completely different image, unfortunately: it detects edges, but flat areas come out as different colours (in a typical normal map, the result is mostly a flat blue colour). I wanted to leave a comment as I wasted time porting it.
  • jozxyqk
    jozxyqk over 4 years
    @Richard try scaling the result from the [-1, 1] range to [0, 1], i.e. n * 0.5 + 0.5.
  • Kaaf
    Kaaf almost 4 years
    Not sure why you are normalizing the tangents rather than the final normal, which is what should be normalized.
  • Tim Autin
    Tim Autin almost 4 years
    Thank you very much for this snippet!! I'm trying to migrate my CPU-based normals to a GPU-based version using your shader. It nearly works, but something is still wrong with the lighting. I posted a question, could you check it? stackoverflow.com/questions/62519673/cpu-to-gpu-normal-mapping