Hi Guys,
I am in the process of getting my head around Geometry Shaders, and so far so good.
I understand the geometry side of things, but I'm having trouble passing the UV information through to the pixel shader.
At the moment I have this:
struct GSOutput
{
    float4 pos : SV_POSITION;
    float2 texcoord : TEXCOORD0;
};

[maxvertexcount(3)]
void gs_main(triangle float4 input[3] : SV_POSITION, inout TriangleStream<GSOutput> output)
{
    for (uint i = 0; i < 3; i++)
    {
        GSOutput element;
        element.pos = input[i];
        element.texcoord = input[i]; // Assigns the float4 position into the float2 UV slot (implicit truncation) - SV_POSITION is the only input I declare.
        output.Append(element);
    }
}
This gives the incorrect output shown below:

If I adjust the pixel shader to fill the quads with white, I get the desired look (although without the texture, obviously).
Texture2D TextureStandard : register(t0);
SamplerState SamplerClamp : register(s0);

struct PS_Input
{
    float4 position : SV_POSITION;
    float2 textureCoord : TEXCOORD0;
};

float4 ps_main(PS_Input input) : SV_TARGET
{
    float4 texColor = TextureStandard.Sample(SamplerClamp, input.textureCoord);
    // Added to show what the geometry itself should look like.
    texColor.r = 1.0f;
    texColor.g = 1.0f;
    texColor.b = 1.0f;
    texColor.a = 1.0f;
    return texColor;
}

From this we can tell that the UV data isn't being passed correctly from the Geometry Shader to the Pixel Shader.
If I remove the Geometry Shader entirely, I get the desired output shown below, which rules out any issues in the Vertex and Pixel Shaders.

So essentially, I am wondering: how do you correctly pass the UV data along in the Geometry Shader?
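My current guess is that I need to give the Geometry Shader an input struct that mirrors my Vertex Shader output, so the texcoord actually reaches the GS, rather than declaring the input as a bare SV_POSITION. Something like the sketch below - VSOutput is just a name I've made up here, assuming my VS writes out the same pos/texcoord pair:

struct VSOutput
{
    float4 pos : SV_POSITION;
    float2 texcoord : TEXCOORD0;
};

[maxvertexcount(3)]
void gs_main(triangle VSOutput input[3], inout TriangleStream<GSOutput> output)
{
    for (uint i = 0; i < 3; i++)
    {
        GSOutput element;
        element.pos = input[i].pos;           // pass the position through unchanged
        element.texcoord = input[i].texcoord; // forward the real UVs from the VS
        output.Append(element);
    }
}

Is that the right idea, or am I missing something else?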
Any help would be hugely appreciated.
Many thanks. 🙂