
Computing normals for lighting?

Started by November 12, 2014 05:03 PM
3 comments, last by Buckeye 10 years, 2 months ago

So I got this code to compute a normal. Here it is.


void ComputeNormal(const D3DXVECTOR3& p0,
                   const D3DXVECTOR3& p1,
                   const D3DXVECTOR3& p2,
                   D3DXVECTOR3& out)
{
    // Two edges of the triangle
    D3DXVECTOR3 u = p1 - p0;
    D3DXVECTOR3 v = p2 - p0;
    // The cross product is perpendicular to the face
    D3DXVec3Cross(&out, &u, &v);
    D3DXVec3Normalize(&out, &out);
}

And here is the drawing part of my DirectX code; it draws a pyramid.


void LightingApp::DRAWMYPYRAMID()
{
	D3DXVECTOR3 test;

	// Face normal of the front triangle (vertices 3, 2, and 4 below)
	ComputeNormal(D3DXVECTOR3(-2.0f, 0.0f, -2.0f),
	              D3DXVECTOR3(0.0f, 2.0f, 0.0f),
	              D3DXVECTOR3(2.0f, 0.0f, -2.0f),
	              test);


	Vertex vertices[] =
	{
		{ XMFLOAT3(-2.0f, 0.0f,  2.0f) },
		{ XMFLOAT3( 2.0f, 0.0f,  2.0f) },
		{ XMFLOAT3( 0.0f, 2.0f,  0.0f) },
		{ XMFLOAT3(-2.0f, 0.0f, -2.0f) },
		{ XMFLOAT3( 2.0f, 0.0f, -2.0f) }
	};

    D3D11_BUFFER_DESC vbd;
    vbd.Usage = D3D11_USAGE_IMMUTABLE;
    vbd.ByteWidth = sizeof(Vertex) * 5;
    vbd.BindFlags = D3D11_BIND_VERTEX_BUFFER;
    vbd.CPUAccessFlags = 0;
    vbd.MiscFlags = 0;
	vbd.StructureByteStride = 0;
    D3D11_SUBRESOURCE_DATA vinitData;
    vinitData.pSysMem = vertices;
    HR(md3dDevice->CreateBuffer(&vbd, &vinitData, &mBoxVB));


	Vertex normal[] =
	{
		{ XMFLOAT3(test.x, test.y, test.z) }
	};


	D3D11_BUFFER_DESC vbds;
	vbds.Usage = D3D11_USAGE_IMMUTABLE;
	vbds.ByteWidth = sizeof(Vertex) * 1;
	vbds.BindFlags = D3D11_BIND_VERTEX_BUFFER;
	vbds.CPUAccessFlags = 0;
	vbds.MiscFlags = 0;
	vbds.StructureByteStride = 0;
	D3D11_SUBRESOURCE_DATA vinitDatas;
	vinitDatas.pSysMem = normal;
	HR(md3dDevice->CreateBuffer(&vbds, &vinitDatas, &mBoxVBs));

	// Create the index buffer

	UINT indices[] = {
		0, 1, 4,
		4, 3, 0,
		0, 2, 1,
		1, 2, 4,
		4, 2, 3,
		3, 2, 0
	};

	D3D11_BUFFER_DESC ibd;
    ibd.Usage = D3D11_USAGE_IMMUTABLE;
    ibd.ByteWidth = sizeof(UINT) * 18;
    ibd.BindFlags = D3D11_BIND_INDEX_BUFFER;
    ibd.CPUAccessFlags = 0;
    ibd.MiscFlags = 0;
	ibd.StructureByteStride = 0;
    D3D11_SUBRESOURCE_DATA iinitData;
    iinitData.pSysMem = indices;
    HR(md3dDevice->CreateBuffer(&ibd, &iinitData, &mBoxIB));


	
}

I'm getting two sides colored and the whole bottom colored. I'm only calculating one normal for one side, so I only want one triangle to be colored, but I'm seeing two sides and the bottom colored. What am I messing up in the code, and how can I resolve it?

Why are you creating 2 vertex buffers? Why not just store the normals with the vertex positions? I understand that in some cases separate buffers may be desired.

Anyway, your second vertex buffer doesn't contain the same amount of data as the first one - so probably one of the vertices has a normal and the rest use some garbage.

Since normals are defined per vertex, it is possible that you see the lighting applied as you describe.

Cheers!


I think I'm confused here. Do I use face normals or vertex normals for lighting?

If I use vertex normals, is this code correct?


Vertex normal[] =
    {
		{ XMFLOAT3(1.0f, 0.0f, 1.0f) },
		{ XMFLOAT3(1.0f, 0.0f, 1.0f) },
		{ XMFLOAT3(0.0f, 1.0f, 0.0f) },
		{ XMFLOAT3(1.0f, 0.0f, 1.0f) },
		{ XMFLOAT3(1.0f, 0.0f, 1.0f) }
    };

The vertex shader is executed once per vertex - and typically you'll store normals in the vertex data, so using vertex normals is the usual way of lighting an object.

Lighting an object like a pyramid correctly with per-vertex normal data requires you to duplicate vertices at the corners in order to have a uniform surface normal for each of the sides. The same applies to objects like a cube - one vertex normal can't be correct for multiple sides of a cube.

Of course, nothing prevents you from calculating the triangle normal in a geometry shader, since it works on primitives instead of vertices - but I would just create some extra vertices, that is, 3 vertices for each side and 4 for the bottom, totaling 3 * 4 + 4 = 16 vertices.

Cheers!

I think im confused here. Do I use Face normal or Vertex Normals for lighting?

Which do you want to use? Either one can be used. But for indexed vertices, face normals are (commonly) needed.

EDIT: As kauna mentions, providing face normals for indexed vertices is a bit more complicated. Rather than dealing with those complications, see "For face normals..." below.

Vertex normals (as kauna mentions) are normally part of the vertex structure and are sent to the pipeline. Either the vertex shader computes the diffuse contribution from the transformed normal and the light direction and passes the color on to the pixel shader, or the transformed normal is passed on to the pixel shader and the lighting is calculated there.

For face normals, I would recommend calculating them in the pixel shader: pass the world position (position * worldMatrix) of each vertex to the pixel shader and compute the normal there. That keeps the vertex buffer position-only (or with color and/or texcoords), with no normal needed.


// vertex shader
    output.Pos = mul(input.Pos, World); // whatever you use
    output.worldPos = output.Pos.xyz;
    ... [view-projection and other stuff follows]

// pixel shader
// ddx/ddy are the screen-space derivatives of the interpolated world position,
// i.e. two vectors lying in the plane of the triangle - their cross product
// is the face normal.
float3 normal = normalize(cross(ddx(input.worldPos), ddy(input.worldPos)));
float color = dot(normal, normalize(lightDir)); // lightDir is provided in a constant buffer

Please don't PM me with questions. Post them in the forums for everyone's benefit, and I can embarrass myself publicly.

You don't forget how to play when you grow old; you grow old when you forget how to play.

