
DirectX Input Layout w/ Instancing Problem

Started by September 05, 2018 07:20 PM
21 comments, last by vinterberg 6 years, 5 months ago

Hi all, I am attempting to follow the Rastertek instancing tutorial: http://www.rastertek.com/dx11tut37.html

 

Right now I am having a problem: it appears that my input layout is not being initialized properly, and I'm not sure why. An exception is thrown when I call CreateInputLayout...

 

Exception thrown at 0x00007FFD9B8EA388 in MyGame.exe: Microsoft C++ exception: _com_error at memory location 0x0000000D4D18ED30.

 

Maybe you all can point out where I'm going wrong here?


void Renderer::InitPipeline()
{
	// load and compile the two shaders
	ID3D10Blob *VS, *PS;
	D3DX11CompileFromFile("Shaders.shader", 0, 0, "VShader", "vs_4_0", 0, 0, 0, &VS, 0, 0);
	D3DX11CompileFromFile("Shaders.shader", 0, 0, "PShader", "ps_4_0", 0, 0, 0, &PS, 0, 0);

	// encapsulate both shaders into shader objects
	dev->CreateVertexShader(VS->GetBufferPointer(), VS->GetBufferSize(), NULL, &pVS);
	dev->CreatePixelShader(PS->GetBufferPointer(), PS->GetBufferSize(), NULL, &pPS);

	// set the shader objects
	devcon->VSSetShader(pVS, 0, 0);
	devcon->PSSetShader(pPS, 0, 0);

	// create the input layout object
	D3D11_INPUT_ELEMENT_DESC ied[] =
	{
		{ "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0 },
		{ "COLOR", 0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, 12, D3D11_INPUT_PER_VERTEX_DATA, 0 },
		// Add another input for the instance buffer
		{ "INSTANCE", 0, DXGI_FORMAT_R32G32B32_FLOAT, 1, 0, D3D11_INPUT_PER_INSTANCE_DATA, 1}
	};

	dev->CreateInputLayout(ied, 2, VS->GetBufferPointer(), VS->GetBufferSize(), &pLayout);
	devcon->IASetInputLayout(pLayout);
}

 

[screenshot]

 

If I have not provided enough information, please let me know what else is needed and I will provide it.

Anything from the debug layer?

If you don't know how to enable the debug layer, it's very simple: just add D3D11_CREATE_DEVICE_DEBUG to the flags when creating the Direct3D device.


UINT creationFlags = 0;

#ifdef _DEBUG
creationFlags |= D3D11_CREATE_DEVICE_DEBUG;
#endif

D3D_FEATURE_LEVEL capabilities;
D3D11CreateDeviceAndSwapChain(  NULL,
                                D3D_DRIVER_TYPE_HARDWARE,
                                NULL,
                                creationFlags,
                                featureLevels, //D3D_FEATURE_LEVEL *
                                ARRAYSIZE(featureLevels), //number of feature levels
                                D3D11_SDK_VERSION,
                                &swapChainDesc,
                                &m_swapChain,
                                &m_device,
                                &capabilities,
                                &m_devCon);

Check this for further reading.

If you enable this, I'm sure you'll fix the error in no time.

Tell me if you fixed the issue or you need further help.


It appears I am missing something to be able to enable this flag:

 

D3D11CreateDevice: Flags (0x2) were specified which require the D3D11 SDK Layers for Windows 10, but they are not present on the system.
These flags must be removed, or the Windows 10 SDK must be installed.
Flags include: D3D11_CREATE_DEVICE_DEBUG
Exception thrown at 0x00007FFD9B8EA388 in MyGame.exe: Microsoft C++ exception: _com_error at memory location 0x000000F646BDE660.
Exception thrown: read access violation.
this->swapchain was nullptr.

 

I'll have to give this a try on my home PC.

Until you get to your home PC, here are a few things to check. Maybe there's a typo:

Check the spelling of "POSITION", "COLOR" and "INSTANCE" in the shader.

Check whether VS is a valid pointer (i.e. not nullptr).

These are some mistakes I've made in the past; maybe it's one of them.

Hi kavarna, now that I'm at home we have intermittent power outages; as soon as it clears up I'll check all this. Can't win today!

Before the power cut out on me I got the debug layer turned on, and it mentioned something about my instance input being expected but not found, or something along those lines. Literally seconds after I read the message my power cut out, haha.

CreateInputLayout returns an HRESULT that you should be checking for errors. In your current code, if that function fails, pLayout will be an uninitialized pointer, which you then pass to IASetInputLayout, setting up a crash deep inside D3D when it tries to use this garbage pointer.
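To make the failure visible instead of crashing later, the call can be wrapped in a check like this. This is a minimal sketch: HRESULT and FAILED are stand-ins here so it compiles anywhere, but on Windows they come straight from the SDK headers, and `CheckedCreate` is a hypothetical helper, not part of D3D.

```cpp
#include <cstdint>
#include <cstdio>

// Stand-ins for the Windows SDK definitions so the pattern compiles anywhere;
// on Windows, HRESULT and FAILED() come from <winerror.h>.
typedef int32_t HRESULT;
#define FAILED(hr) ((hr) < 0)
static const HRESULT E_INVALIDARG = static_cast<HRESULT>(0x80070057);

// The pattern: check the HRESULT and bail out before any output pointer
// from a failed creation call can be used.
bool CheckedCreate(HRESULT hr, const char* what) {
    if (FAILED(hr)) {
        std::fprintf(stderr, "%s failed with HRESULT 0x%08X\n",
                     what, static_cast<unsigned>(hr));
        return false; // caller must not touch the output pointer
    }
    return true;
}
```

In InitPipeline that would look something like `HRESULT hr = dev->CreateInputLayout(...); if (!CheckedCreate(hr, "CreateInputLayout")) return;`, so pLayout is never handed to IASetInputLayout uninitialized.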

As for the cause of the failure: you're passing a hard-coded '2' for the array length, but the array actually contains 3 elements.
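The count can be derived from the array itself so the two can never drift apart. A minimal sketch; `InputElementDesc` is a hypothetical stand-in for D3D11_INPUT_ELEMENT_DESC, since only the element count matters here:

```cpp
#include <cstddef>

// Hypothetical stand-in for D3D11_INPUT_ELEMENT_DESC; only the count matters.
struct InputElementDesc { const char* semanticName; };

static const InputElementDesc ied[] = {
    { "POSITION" },
    { "COLOR" },
    { "INSTANCE" }, // the newly added per-instance element
};

// Equivalent of the Windows SDK's ARRAYSIZE(ied): a compile-time element
// count that updates automatically when elements are added or removed.
static const size_t kNumElements = sizeof(ied) / sizeof(ied[0]);
```

With the real types this becomes `dev->CreateInputLayout(ied, ARRAYSIZE(ied), ...)` instead of the hard-coded 2.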


Have you tried setting the array size to 3+ for ied[]?

 

.:vinterberg:.

Hi all, thanks very much for your input here!!!!

This is not quite right, but it's a step in the right direction:

[screenshot]

It should be drawing four quads, built from two triangles apiece; I will work out the rest of the details :)

 

Thanks so much for your input! I will post again if I get stuck.

I got my quads!!

 

[screenshot]

After enabling the device-level debug layer, D3D is giving me a rather cheeky error:


D3D11 WARNING: ID3D11DeviceContext::DrawInstanced: Vertex Buffer at the input vertex slot 0 is not big enough for what the Draw*() call expects to traverse. This is OK, as reading off the end of the Buffer is defined to return 0. However the developer probably did not intend to make use of this behavior.  [ EXECUTION WARNING #356: DEVICE_DRAW_VERTEX_BUFFER_TOO_SMALL]
D3D11 WARNING: ID3D11DeviceContext::DrawInstanced: Input vertex slot 0 has stride 24 which is less than the minimum stride logically expected from the current Input Layout (28 bytes). This is OK, as hardware is perfectly capable of reading overlapping data. However the developer probably did not intend to make use of this behavior.  [ EXECUTION WARNING #355: DEVICE_DRAW_VERTEX_BUFFER_STRIDE_TOO_SMALL]

Where did I set input vertex slot 0 and its stride of 24? Why does it differ from the input layout?

 

I'd like to squash this while I have the opportunity.

RGB32 + RGBA32 = 28 bytes, and your Vertex struct is probably defined as two float3s (which would give 24 bytes)..?
You set the stride yourself, at "strides[0] = sizeof(Vertex);"
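A quick way to see where the 24 comes from. Assuming the Vertex struct currently uses two float3s (hypothetical; the actual definition isn't shown in the thread), sizeof gives 24 bytes, while a struct matching the layout's float3 position + float4 color gives 28:

```cpp
#include <cstddef>

// Hypothetical current definition: float3 position + float3 color -> 24 bytes,
// which triggers the stride warning against the 28-byte input layout.
struct VertexF3 {
    float position[3];
    float color[3];
};

// Definition matching the input layout: R32G32B32_FLOAT position (12 bytes)
// followed by R32G32B32A32_FLOAT color (16 bytes) -> 28 bytes total.
struct VertexF4 {
    float position[3];
    float color[4];
};
```

Since the stride comes from sizeof(Vertex), switching the struct (and the data written into the vertex buffer) to a float4 color makes the stride 28 and should silence both warnings.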

 

.:vinterberg:.

This topic is closed to new replies.
