
Normal Mapping Code

Started by June 20, 2004 02:00 PM
43 comments, last by cippyboy 20 years, 5 months ago
@vincoof: Sorry for that :) I've re-uploaded. Now in a ZIP file, all in JPG format. For DDS I just use some totally hacked code (I don't want to touch it while it "works").
You should never let your fears become the boundaries of your dreams.
Thanks Mirko it's much better now :)
It's true that a good set of textures and models helps a lot in creating effects. Sometimes you code an effect and it looks like crap on a quad, so you think it's a bad effect, but when you apply it to spheres it's wonderful. Sometimes the opposite happens, and sometimes it's more complex than that to see the true power of an effect. Just remember that the modeling part is very important even on the developer's side (who, unfortunately, tends to be a really crap artist in general).
</blabbling>
[_DarkWIng_] Thanks a lot for the textures... really interesting ;) but... I seem to get gray-looking stuff. I don't know why; I have to check whether one of the textures looks gray, or maybe it's my algo. Gotta check.

[vincoof] True, true, especially the part about "the modeling part is very important even on the developer's side (who, unfortunately, just tends to be A REALLY CRAP ARTIST in general)." You just made the point, cheers :)

Relative Games - My apps

@vincoof: *sigh* cannot agree more. I could add something: I noticed that using (strong) JPG compression for height maps/normal maps sometimes gives really bad results. Maybe generated textures could be a solution (I'm thinking about SVG rendering + a bunch of filters).
BTW, people who missed this should really take a look: impressive...

@darkwing: Your textures look really good! Maybe the NeHe community should try to gather free textures and models, so that poor coders don't have to worry about art when testing their new stuff.
Hmmm, do you have a height map for the "pierre" texture? It would probably look good with parallax mapping...
SaM3d!, a cross-platform API for 3d based on SDL and OpenGL.The trouble is that things never get better, they just stay the same, only more so. -- (Terry Pratchett, Eric)
Every texture in general with strong compression looks crap. But it's true that it tends to be even more critical for normal maps.
Quote: Original post by vincoof
Every texture in general with strong compression looks crap. But it's true that it tends to be even more critical for normal maps.


Hmmm, I shouldn't have written "strong". In fact "0.85 JPEG quality" is enough to make bad normal maps (when surfaces are smooth).
@rodzilla: Here you go, but it's not very good because it was created from the diffuse texture with a bit of filtering.

In general, these textures aren't mine. I got them from different sites and free texture packs; I just created normal maps and height maps from them when I need them. This is a good site to get some *very* good textures. Also, nVidia put out a pack of high-res textures & bump maps that can be used in demos and such.

Texture compression can be a b***h. Personally I stick with low-compression JPEG for diffuse textures and height maps that don't need that much detail, and DDS or TGA for the rest (normal maps / lookup tables).
There really isn't much material out there that will simply give you code to do bump mapping. It's a very complex thing to do, especially with tangent space normal maps (ie, all the normal maps you're using)...


The process of normal mapping:


The format of the normal map is XYZ normal = (RGB * 2 - 1), where R, G and B are in the range [0,1]. ie, the normal map is stored in [0,1] but you want it in [-1,1], as a proper 3D vector, ideally normalized. This will eventually be done in a pixel shader.


Now, the problem is that you will be using tangent space normal maps. Ignore this term for the moment, but let me explain the problem it refers to:

A flat normal map (ie, no bumps at all) will be 128,128,255 (light blue) as RGB bytes, which expands to roughly [0,0,1] - ie, straight up. Look at DarkWIng's example normal maps: the flat parts are all this light blue colour.

But the problem is that this texture is applied to a model, or some other complex piece of geometry. If you just use the normal map as 'the normal', then the normals across the entire model will be [0,0,1], which is incorrect (in the lighting equation this will result in the entire model being the same colour) - the vertex normals are being ignored.

Hopefully that makes sense, as it's key to getting normal mapping right.

This is where tangent spaces come in.

Tangent space is a fancy name for rotating the normal you get from the normal map (sort of).

You know the normal from the normal map, and you also know the vertex normal.

If you make a simple vertex shader that outputs just the vertex normal, you will see that the interpolated vertex normals likely vary wildly over the surface of the model, and will be quite ugly really. (Whenever you are unsure of something, use a couple of shaders to visualize it; this helps enormously.)

So, what we effectively want to end up doing is 'rotate' the normal from the normal map relative to the rotation of the vertex normal (does this make sense?) - ie, for a flat normal map, you want to rotate [0,0,1] so it becomes exactly the same as the vertex normal. (Sort of - I'll get to the catches soon.)

The problem here is that you need a matrix to rotate with. A 3x3 matrix - that's 3 sets of XYZ. All you have is the XYZ of the normal, and you can't really get a good matrix from just this one vector; there is too much missing information.

So. We need tangent space vectors.

These are probably some of the most impossible things to visualize there are (well, initially at least) - until you know what they are.

One of the reasons we need them is a problem that occurs if you just generate the needed matrix from the normal...

You could roughly generate this matrix by cross-producting the normal with either (1,0,0) or (0,1,0) (it's possible you will get a zero-length vector from one of them) - that result can be the Y rotation vector of the matrix, and crossing it with the normal gives the last vector, X. But this produces a really dodgy problem, which I've seen in the odd demo posted here...

If you rotate the flat normal map value - [0,0,1] - by this matrix, you will get the same as the vertex normal (since the matrix multiply is effectively v.x*X + v.y*Y + v.z*Z, and here only v.z is nonzero, so the output is Z from the matrix, ie the vertex normal). The problem comes when you do not have a flat normal map. When thinking in terms of the vertex normal, what is left? What is right? Or, in more mathematical terms: what are x and y on the normal map in terms of the vertex normal?

Think of it this way: if you had a texture with red lines running across it (flat, on the x axis), and green ones running down it, how would they appear on the model when you apply it as a texture (instead of your standard texture)? The red lines would run along the X axis of the 'tangent space' of each vertex, and the green lines along the Y tangent space axis. These directions represent what you want [X,Y,..] in the normal map to do to the direction of the transformed normal.

This is what tangent space vectors solve: each vertex, along with a normal, also has one or two tangent space vectors (if you use one, you can generate the third with a cross product - but then you can't have mirrored textures without some tricks).

Calculating tangent vectors is difficult. I won't give you code here because the code I wrote to do this is quite complex (all sorts of matrix stuff) - I'm sure there is a better way that's easier to understand. But it eventually comes down to working out the tangent vectors for each triangle, then adding them to each vertex, then normalizing the per-vertex tangent vectors.

I personally store tangent vectors in the first texture coordinate, as a float4 (to help with the mirroring problem). D3D provides a direct 'tangent' vertex data type, however - think something along the lines of glTangent3f :)

So.

That's the big obstacle.

But what this will eventually give you is a matrix per vertex (2 or 3 XYZ vectors, to make a 3x3 matrix). This is done in the vertex shader, not the pixel shader. And this is where a trick comes in that probably stumps a lot of people.

You'd think that with this matrix you would then go ahead and rotate the normal map normal [x,y,z] by the matrix... But no, you don't (this is very inefficient). What you do is rotate the _light_, per vertex. But because it's the light, not the normal, you need to use the inverse of the matrix, which conveniently is just the transpose (since it's an orthonormal matrix). I guess you can always just rotate the normal map values in the pixel shader instead - this actually avoids some issues with interpolation 'spots' you can get on really low-detail curved models with point lights... but never mind. :)
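That per-vertex light rotation boils down to three dot products, since multiplying by the transpose of an orthonormal matrix is the same as dotting against its basis vectors; a C sketch (names are illustrative):

```c
#include <assert.h>
#include <math.h>

static float dot3(const float a[3], const float b[3])
{
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

/* Bring the light vector into tangent space. The tangent->world matrix has
   T, B, N as columns; its inverse is its transpose (the basis is orthonormal),
   so multiplying by the inverse is just a dot against each basis vector. */
static void light_to_tangent_space(const float light[3], const float t[3],
                                   const float b[3], const float n[3],
                                   float out[3])
{
    out[0] = dot3(light, t);
    out[1] = dot3(light, b);
    out[2] = dot3(light, n);
}
```

With the transformed light in tangent space, the pixel shader can dot it directly against the raw normal map value - no per-pixel matrix needed.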

so..

start off with something really simple.

Get your shaders just displaying the vertex normal.

Then try and calculate tangent spaces. You will need a tutorial on this. They are also quite hard to find.
Or do it yourself. I did. You learn much more this way.

Visualize the tangent spaces with GL_LINES. Eventually, with the normal and two tangents, you should get 3 lines at each vertex, each with a length of 1, all at right angles to each other.
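A numeric version of that sanity check, in case the lines are hard to eyeball (a hypothetical helper, not from the post):

```c
#include <assert.h>
#include <math.h>

static float dot3(const float a[3], const float b[3])
{
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

/* Numeric version of the GL_LINES check: all three vectors unit length,
   all three pairs perpendicular (within eps). Returns 1 if OK, 0 if not. */
static int is_orthonormal_basis(const float t[3], const float b[3],
                                const float n[3], float eps)
{
    return fabsf(dot3(t, t) - 1.0f) < eps &&
           fabsf(dot3(b, b) - 1.0f) < eps &&
           fabsf(dot3(n, n) - 1.0f) < eps &&
           fabsf(dot3(t, b)) < eps &&
           fabsf(dot3(t, n)) < eps &&
           fabsf(dot3(b, n)) < eps;
}
```

Run it over every vertex after averaging the per-triangle tangents; failures usually point at mirrored UVs or degenerate triangles.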

then, give a go at displaying the normal map using a shader (easy :) - remember the ((value*2) - 1) bit to map it to [-1,1]

this time, however, pass the normals/tangents into the pixel shader (through texture coords or whatnot), normalize them, and then make a matrix from them (this might be difficult in a pixel shader, so you might just need to do the raw multiply with the vectors, like I mentioned above).
If all goes right, your flat normal map should look exactly the same as the vertex normal. If it doesn't, you've done something wrong - so visualize all your variables and fix it.

Once this is done, use a more complex normal map and see what happens..

If all goes OK, then DOT the transformed normal map value with a fixed direction (eg, (0,0,-1)) to simulate a directional light. All going well, this will work. If it doesn't, visualize and fix.
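That dot is the whole lighting equation at this stage; in C it is just (assuming the light vector points *toward* the light):

```c
#include <assert.h>

/* Minimal directional diffuse term: N dot L, clamped at zero so surfaces
   facing away from the light go black rather than negative.
   l points toward the light - negate a "direction of travel" vector
   like (0,0,-1) before passing it in. */
static float diffuse(const float n[3], const float l[3])
{
    float d = n[0]*l[0] + n[1]*l[1] + n[2]*l[2];
    return d > 0.0f ? d : 0.0f;
}
```

Multiply this scalar into your texture colour and you have basic normal-mapped directional lighting.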

Then you can move on to things like point-lights, which are a lot harder.

I won't go over inverse biased falloff and all that stuff; you can work it out for yourself. :) It's more interesting that way.

However I can say that getting the light direction to a point is more complex now :)


Ahh well. enjoy. :)

samples:
- this is not quite how I do it, but close
//vertex shader

//vertex format
struct input
{
	float4 position		: POSITION;
	float3 normal		: NORMAL;
	float3 tangent		: TEXCOORD0;
	float2 tex		: TEXCOORD1;
};

struct output
{
	float4 position		: POSITION;
	float2 texCoords	: TEXCOORD0;
	float3 lightVector	: TEXCOORD1;
	float3 viewVector	: TEXCOORD2;
	...;
};

//shader (rotate light)
void main...
{
	...
	OUT.position = mul(ModelViewProj, IN.position);
	...
	float3x3 tangent = { IN.tangent.xyz,
			     normalize(cross(IN.tangent.xyz, IN.normal)),
			     IN.normal };

	OUT.lightVector = ...;
	OUT.lightVector = mul(tangent, OUT.lightVector);

	etc...
}


that may help with a head start...
ok I didn't mean to sound patronizing or anything, just letting you know.

I thought it'd be best to go over normal mapping start-to-finish just to make sure - there are too many steps you can miss.. plus it should help others reading and interested in the technique.
[RipTorn] Thanks a lot, somehow I regained confidence in that stuff :) but... as I said before, I already "imported" code for calculating the tangent space from "NMView" from ATI's NormalMapper (because it's previewing Normal Mapping, so... it does Normal Mapping, so I copied the code... it wasn't too easy either).
Now I've seen some differences between codes, like... places where they use GL_DOT3_RGBA_EXT and... I don't use alpha? Why RGBA? And for the normal maps (in the code-generation process, I mean) I've seen that alpha is 0xCC (constant), which is 204 (so that's 0 and a little something in the equation, right?)

Vertex/pixel shaders...? I haven't entered that area; it feels strange, like writing a huge amount of code, but I'll do it some day :) PS: with a normal map (external) and a vertex program (which is a string, and I can copy-paste, right?) I could have normal mapping without too much stress? Because after it's working I'd try to understand it. Sometimes you just write the code accordingly and you miss simple variables/stuff and the whole thing doesn't work, and I can't even remember how often this happens :)


This topic is closed to new replies.
