
Beginning DirectX, Ortho?

Started August 24, 2018 03:37 PM
28 comments, last by JWColeman 6 years, 3 months ago
6 minutes ago, Zakwayda said:

Unfortunately I'm not using Direct3D currently and haven't for quite a while, so I may not be able to help with the particulars of your problem.

I do have a suggestion though. If I were you I'd try to find a simple, self-contained tutorial with complete working source code that does the basics (renders a simple primitive with a matrix transform, etc.). Get a project working from it, copy-pasting or using the code verbatim if needed. Assuming you can get it working, then you'll have some reference code that you know works (assuming the tutorial is sound). You can then try to build your own code from there, and if things go wrong, you'll have working code to check against. 

Anyway, just a suggestion. (Of course it's also possible someone who's up to speed with Direct3D will be able to answer your question here.)

I agree on finding a self-contained tutorial; this is trickier than you'd think it would be.

The Rastertek tutorial is huge. I could probably attempt to compile it and tear it apart to get to the bottom of this mess, but that could take a while...

Persistence, I've found, is always a large part of success in learning to program anything :D

I still strongly suggest compiling the shader using the HLSL compiler included in Visual Studio as I explained in my previous post. That has a lot of advantages and you actually see detailed error messages when something is wrong. It is also the correct way to do it.

24 minutes ago, Magogan said:

I still strongly suggest compiling the shader using the HLSL compiler included in Visual Studio as I explained in my previous post. That has a lot of advantages and you actually see detailed error messages when something is wrong. It is also the correct way to do it.

Hi Magogan, the only trouble I'm having with following the steps in the link is D3DReadFileToBlob: it's undefined, and I'm not sure what to include to define it.


std::ifstream vs_stream;
	
vs_stream.open(FileName, std::ifstream::in | std::ifstream::binary);
if (vs_stream.good()){
		
	vs_stream.seekg(0, std::ios::end);
	auto vs_size = size_t(vs_stream.tellg());
	auto vs_data = new char[vs_size];
	vs_stream.seekg(0, std::ios::beg);
	vs_stream.read(vs_data, vs_size);
	assert(vs_stream.good() && vs_stream.gcount() == vs_size);
	vs_stream.close();
	
	//Initialize shader here using pDevice->CreateVertexShader, pDevice->CreateInputLayout, pDevice->CreatePixelShader, ...
	//pDevice is your graphics device (of type ID3D11Device*)
		
	delete[] vs_data;
	
}

I'm sorry, I googled it and it uses the D3DCompiler API. However, you can simply use the code above without including any additional header or DLL.
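If you do want the blob route instead: D3DReadFileToBlob is declared in d3dcompiler.h and needs d3dcompiler.lib linked. A minimal sketch (pDevice and pVS are placeholder names, as in the comments in the code above):

#include <d3dcompiler.h>
#pragma comment(lib, "d3dcompiler.lib")

// Read precompiled bytecode (.cso) into a blob, then create the shader from it.
ID3DBlob* vsBlob = nullptr;
if (SUCCEEDED(D3DReadFileToBlob(L"VertexShader.cso", &vsBlob)))
{
	pDevice->CreateVertexShader(vsBlob->GetBufferPointer(), vsBlob->GetBufferSize(), nullptr, &pVS);
	vsBlob->Release();
}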

13 minutes ago, Magogan said:


std::ifstream vs_stream;
	
vs_stream.open(FileName, std::ifstream::in | std::ifstream::binary);
if (vs_stream.good()){
		
	vs_stream.seekg(0, std::ios::end);
	auto vs_size = size_t(vs_stream.tellg());
	auto vs_data = new char[vs_size];
	vs_stream.seekg(0, std::ios::beg);
	vs_stream.read(vs_data, vs_size);
	assert(vs_stream.good() && vs_stream.gcount() == vs_size);
	vs_stream.close();
	
	//Initialize shader here using pDevice->CreateVertexShader, pDevice->CreateInputLayout, pDevice->CreatePixelShader, ...
	//pDevice is your graphics device (of type ID3D11Device*)
		
	delete[] vs_data;
	
}

I'm sorry, I googled it and it uses the D3DCompiler. However, you can simply use the code above without including any additional header or DLL.

Hi Magogan, thanks for the code snippet; this all works once I include <fstream> and <assert.h>.

 

However, I'm not sure how to call CreateVertexShader without the blob VS?

Is it like this?

        //Initialize shader here using pDevice->CreateVertexShader, pDevice->CreateInputLayout, pDevice->CreatePixelShader, ...
        dev->CreateVertexShader(vs_data, vs_size, NULL, &pVS);

And the input layout creation... like so?

 

        // create the input layout object
        D3D11_INPUT_ELEMENT_DESC ied[] =
        {
            { "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0 },
        { "COLOR", 0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, 12, D3D11_INPUT_PER_VERTEX_DATA, 0 },
        };

        dev->CreateInputLayout(ied, 2, vs_data, vs_size, &pLayout);
        devcon->IASetInputLayout(pLayout);

 

Then, when I'm done there, I repeat the process for the pixel shader.
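Something like this, I'm guessing, with ps_data/ps_size read the same way as the vertex shader bytes and pPS as my pixel shader pointer, plus an HRESULT check so a failure isn't silent:

        // Read PixelShader.cso into ps_data/ps_size the same way, then:
        HRESULT hr = dev->CreatePixelShader(ps_data, ps_size, NULL, &pPS);
        assert(SUCCEEDED(hr)); // the same check is useful on CreateVertexShader and CreateInputLayout too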

Okay. I'll work on this today.

Thanks Magogan, I think I got it!

 


#include <fstream>   // std::ifstream
#include <cassert>   // assert

void Renderer::InitPipeline()
{
	std::ifstream vs_stream;

	// Load the compiled vertex shader bytecode (.cso produced by the HLSL compiler at build time).
	vs_stream.open("VertexShader.cso", std::ifstream::in | std::ifstream::binary);

	if (vs_stream.good()) {

		vs_stream.seekg(0, std::ios::end);
		auto vs_size = size_t(vs_stream.tellg());
		auto vs_data = new char[vs_size];
		vs_stream.seekg(0, std::ios::beg);
		vs_stream.read(vs_data, vs_size);
		assert(vs_stream.good() && vs_stream.gcount() == vs_size);
		vs_stream.close();

		// Create the vertex shader from the raw bytecode.
		dev->CreateVertexShader(vs_data, vs_size, NULL, &pVS);

		// Create the input layout object (validated against the vertex shader bytecode).
		D3D11_INPUT_ELEMENT_DESC ied[] =
		{
			{ "POSITION", 0, DXGI_FORMAT_R32G32B32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0 },
			{ "COLOR", 0, DXGI_FORMAT_R32G32B32A32_FLOAT, 0, 12, D3D11_INPUT_PER_VERTEX_DATA, 0 },
		};

		dev->CreateInputLayout(ied, 2, vs_data, vs_size, &pLayout);

		delete[] vs_data;

	}

	// Reuse the stream for the pixel shader (open() clears the stream state on success in C++11 and later).
	vs_stream.open("PixelShader.cso", std::ifstream::in | std::ifstream::binary);

	if (vs_stream.good()) {

		vs_stream.seekg(0, std::ios::end);
		auto vs_size = size_t(vs_stream.tellg());
		auto vs_data = new char[vs_size];
		vs_stream.seekg(0, std::ios::beg);
		vs_stream.read(vs_data, vs_size);
		assert(vs_stream.good() && vs_stream.gcount() == vs_size);
		vs_stream.close();

		// Create the pixel shader from the raw bytecode.
		dev->CreatePixelShader(vs_data, vs_size, NULL, &pPS);

		delete[] vs_data;

	}

	// Bind the input layout and shader objects to the immediate context.
	devcon->IASetInputLayout(pLayout);
	devcon->VSSetShader(pVS, 0, 0);
	devcon->PSSetShader(pPS, 0, 0);

}

Now that it works, maybe you can explain how it helps me :D

 

Wait a minute, it's not working yet; it doesn't appear to be drawing my quad :(


Well, it just reads the compiled shader from the file and loads it. You may want to structure it a little better so you can reuse the code for loading multiple shaders in the future.
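Something like this, roughly - a hypothetical ReadBinaryFile helper, just as a sketch:

#include <fstream>
#include <string>
#include <vector>

// Read a whole binary file (e.g. a compiled .cso) into a byte vector.
// Returns an empty vector if the file could not be opened.
std::vector<char> ReadBinaryFile(const std::string& fileName)
{
	std::ifstream stream(fileName, std::ifstream::in | std::ifstream::binary);
	if (!stream.good())
		return {};
	stream.seekg(0, std::ios::end);
	std::vector<char> data(size_t(stream.tellg()));
	stream.seekg(0, std::ios::beg);
	stream.read(data.data(), data.size());
	return data;
}

// Usage:
// auto vs = ReadBinaryFile("VertexShader.cso");
// if (!vs.empty()) dev->CreateVertexShader(vs.data(), vs.size(), NULL, &pVS);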

The vertex coordinates expected by the orthographic matrix are in pixels, I think, from -0.5 * screen size to 0.5 * screen size. I just saw you used 0.5f as coordinates - you have to use e.g. -100 to 100 for a 200x200 px rectangle, if I'm not mistaken.
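Concretely, D3DXMatrixOrthoLH(&m, w, h, zn, zf) scales x by 2/w and y by 2/h (m, w, h, zn, zf are just placeholder names here), so in pixel terms:

// x_clip = x * 2 / w   ->  x = -w/2 maps to -1,  x = +w/2 maps to +1
// y_clip = y * 2 / h   ->  y = -h/2 maps to -1,  y = +h/2 maps to +1
// A coordinate of 0.5f therefore ends up only half a pixel away from the centre of the screen.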

5 minutes ago, Magogan said:

Well, it just reads the compiled shader from the file and loads it. You may want to structure it a little better so you can reuse the code for loading multiple shaders in the future.

The vertex coordinates expected by the orthographic matrix are in pixels, I think, from -0.5 * screen size to 0.5 * screen size. I just saw you used 0.5f as coordinates - you have to use e.g. -100 to 100 for a 200x200 px rectangle, if I'm not mistaken.



Hi Magogan, I cleared the ortho code because I wasn't sure it was working. Can you tell me: is vs_data the right parameter to pass as the data for CreateVertexShader and CreatePixelShader?

I finally got this thing to work!

 

So, I had to set up a constant buffer, and I kept everything very simple. This is my C++ representation of my shader cbuffer:


	struct ConstantBuffer
	{
		D3DXMATRIX projection;
	};
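(One thing worth noting here: D3D11 requires a constant buffer's ByteWidth to be a multiple of 16 bytes. A D3DXMATRIX is 64 bytes, so this struct is fine as-is; a compile-time guard is an easy way to keep that true if more fields get added later - just a sketch:)

	// Constant buffer sizes must be multiples of 16 bytes in D3D11.
	static_assert(sizeof(ConstantBuffer) % 16 == 0, "ConstantBuffer size must be a multiple of 16 bytes");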

And here is my pointer to the constant buffer, as well as the matrix I will create using DirectX:


ID3D11Buffer *pCBuffer;				   // the constant buffer
D3DXMATRIX orthoMatrix;

 

Here is the shader cbuffer:


cbuffer ConstantBuffer : register(b0)
{
	matrix projection;
}

 

Now, I had to map this stuff, and this tutorial was extremely helpful: https://docs.microsoft.com/en-us/windows/desktop/direct3d11/overviews-direct3d-11-resources-buffers-constant-how-to

I was able to translate their example into my own code:


	ConstantBuffer cbuffer;
	D3DXMatrixOrthoLH(&orthoMatrix, SCREEN_WIDTH, SCREEN_HEIGHT, 0, 1);
	cbuffer.projection = orthoMatrix;

	// Fill in a buffer description.
	D3D11_BUFFER_DESC cbDesc;
	cbDesc.ByteWidth = sizeof(ConstantBuffer);
	cbDesc.Usage = D3D11_USAGE_DYNAMIC;
	cbDesc.BindFlags = D3D11_BIND_CONSTANT_BUFFER;
	cbDesc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
	cbDesc.MiscFlags = 0;
	cbDesc.StructureByteStride = 0;


	// Fill in the subresource data.
	D3D11_SUBRESOURCE_DATA InitData;
	InitData.pSysMem = &cbuffer;
	InitData.SysMemPitch = 0;
	InitData.SysMemSlicePitch = 0;

	dev->CreateBuffer(&cbDesc, &InitData, &pCBuffer);
	devcon->VSSetConstantBuffers(0, 1, &pCBuffer);
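(Since the buffer is created with D3D11_USAGE_DYNAMIC and CPU write access, it can also be updated later - e.g. once per frame - with Map/Unmap instead of being recreated. A rough sketch using the same names as above; memcpy needs <cstring>:)

	// Update the constant buffer contents without recreating it.
	D3D11_MAPPED_SUBRESOURCE mapped;
	if (SUCCEEDED(devcon->Map(pCBuffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped)))
	{
		memcpy(mapped.pData, &cbuffer, sizeof(cbuffer));
		devcon->Unmap(pCBuffer, 0);
	}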

Then, my updated shader code to apply the transformation:


VOut VShader(float4 position : POSITION, float4 color : COLOR)
{
	VOut output;

	output.position = mul(position, projection);
	output.color = color;

	return output;
}
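(One caveat worth noting: D3DX stores matrices row-major, while HLSL packs cbuffer matrices column-major by default, so general matrices are usually transposed before upload. It happens not to matter here because this particular ortho matrix, with a near plane of 0, is diagonal. The usual pattern, as a sketch with the names from the code above:)

	// For general transforms, transpose the D3DX (row-major) matrix before uploading,
	// since HLSL defaults to column-major packing for cbuffer matrices.
	D3DXMatrixTranspose(&cbuffer.projection, &orthoMatrix);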

 

Now, instead of using relative coordinates for my triangle vertices, I use some pixel coordinates:


	// create a triangle using the Vertex struct
	Vertex OurVertices[] =
	{
		{ D3DXVECTOR2(0, 100), D3DXCOLOR(1.0f, 0.0f, 0.0f, 1.0f) },
		{ D3DXVECTOR2(100, -100), D3DXCOLOR(0.0f, 1.0f, 0.0f, 1.0f) },
		{ D3DXVECTOR2(-100, -100), D3DXCOLOR(0.0f, 0.0f, 1.0f, 1.0f) }
	};

[Attached screenshot of the rendered result]

 

