
Texture Coordinates Not Recognized

Started by December 31, 2015 02:52 AM
5 comments, last by Servant of the Lord 9 years, 2 months ago

I have a HUGE problem and would really appreciate it if someone could tell me how to fix it (sorry, I don't have the code with me, but I can describe it in detail):

I'm using a texture and trying to map polygons with it, but the vertex/pixel shaders are not receiving the texture coordinates. I know this because I've ruled out other problems:

- It draws the polygons in the color of a single texel of the texture. If I hard-code the texture coordinates in the pixel shader, the color changes when I change the coordinates, or when I change which texture I give it to draw.

- The texture coordinates are what they're supposed to be in the data structure in C# that I'm using to store points. I checked them just before drawing a mesh.

- I also made sure the structure is the size it should be (14 floats), and that it matches the InputLayout object I set up with an array of input elements (with semantics like COLOR, TEXCOORD, etc.).

In the shader file I declared a cbuffer, which just contains:

- A matrix for the view transformation

- A float4 to hold the camera position (just so I can do some calculations with that)

- A Texture2D

- A SamplerState.

Then in C# I made a structure that matches those with a Matrix and a Vector4. I tried defining it with sequential and then explicit layout, and it made no difference.
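For reference, declarations like the ones described would typically be written as follows in HLSL (a sketch with placeholder names, not the poster's actual code). Note that Texture2D and SamplerState objects cannot be members of a cbuffer in HLSL; they are bound as separate shader resources, so only the matrix and the float4 actually live in the buffer:

```hlsl
// Constant-buffer data: plain numeric types only.
cbuffer PerFrame : register(b0)
{
    matrix viewProjection; // the view transformation
    float4 cameraPosition; // for extra calculations
};

// Texture and sampler are separate bindings, not cbuffer members.
Texture2D    diffuseTexture : register(t0);
SamplerState linearSampler  : register(s0);
```

With this layout, the C# structure uploaded to the constant buffer only needs to match the matrix plus the float4 (80 bytes); the texture and sampler are set through their own slots on the device context.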

I have two simple shaders in the file:

The vertex shader takes a structure for an input that contains:

- float4 POSITION

- float4 NORMAL

- float4 COLOR

- float2 TEXCOORD

The normal is for lighting purposes, and the color is because sometimes I draw them as a color instead of texture mapping, but don't worry about that part, because it works perfectly.

The output from the vertex shader and input to the pixel shader is as follows:

- float4 SV_POSITION

- float4 COLOR

- float2 TEXCOORD

The vertex shader is very simple: it only does the matrix transformation (which works), then copies the color and texcoord to the output (it isn't even using the normal at the moment).

The pixel shader just returns a sample of the texture at the texture coordinates received from the vertex shader (which should be interpolated across each triangle by the rasterizer).
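Putting the description together, a minimal version of the two shaders might look like this (a hedged sketch; the struct, resource, and cbuffer names are assumptions, not the original code). The input semantic names must match the semantics given in the InputLayout, and the element offsets there must match the C# vertex structure byte-for-byte:

```hlsl
struct VSInput
{
    float4 position : POSITION;
    float4 normal   : NORMAL;   // present in the layout, unused for now
    float4 color    : COLOR;
    float2 texcoord : TEXCOORD;
};

struct PSInput
{
    float4 position : SV_POSITION;
    float4 color    : COLOR;
    float2 texcoord : TEXCOORD;
};

PSInput VS(VSInput input)
{
    PSInput output;
    output.position = mul(input.position, viewProjection); // assumed cbuffer matrix
    output.color    = input.color;    // passed through
    output.texcoord = input.texcoord; // interpolated by the rasterizer
    return output;
}

float4 PS(PSInput input) : SV_TARGET
{
    // assumed texture/sampler names
    return diffuseTexture.Sample(linearSampler, input.texcoord);
}
```

With four float4 semantics plus a float2, the per-vertex stride is 14 floats (56 bytes), and TEXCOORD would sit at byte offset 48; if the InputLayout offset for TEXCOORD doesn't land there, the shader will read zeros, which matches the symptom described.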

Interestingly, I have a different vertex and pixel shader that does successfully texture map. The only differences are:

- The vertex shader input structure only contains a float4 POSITION and a float2 TEXCOORD, and the vertex shader output/pixel shader input structure only contains a float4 SV_POSITION and a float2 TEXCOORD (so they have no COLOR or NORMAL).

- It doesn't have the float4 camera position in the constant buffer (in fact, it doesn't use a constant buffer at all; the matrix, Texture2D, and SamplerState are just declared as globals. However, I read that I should use constant buffers for that stuff, and ironically that shader works while this other one doesn't!).

Does anyone have any ideas? I'll try just about anything.

Texture coordinates are normally between 0.0 and 1.0. If you think the numbers are pixels and try to pass 22.0 to access the 22nd pixel, the sampler will just clamp (or wrap) the coordinate back into the 0.0 to 1.0 range.

The opposite is also true: if your shader expects numbers between 0.0 and textureSize, but the engine is giving it numbers between 0.0 and 1.0, that would also give you the wrong results.

Regardless, to eliminate possibilities, you'll want to remove everything you possibly can from the shader (the normals, the coloration, etc.) and do just the texture access, to be sure you have that part working.


Well I know they range from 0 to 1.

I'll probably need normals and other data later. The point is, that shouldn't get in the way as long as I'm inputting and outputting the results the correct way, and they match what the shader expects, right?

Well, I got it to display the textures by using a different input format. It's rather inconvenient, though, because if I want to change a polygon from colored to textured, I have to destroy and recreate the stream of points.

Also, I'm worried that I won't be able to get as much information as I need into the vertex shader along with the texture coordinates.

Well, I got it to display the textures by using a different input format. It's rather inconvenient, though, because if I want to change a polygon from colored to textured, I have to destroy and recreate the stream of points.

Just use the same shader for the colored polygons, but assign them a pure-white texture. That's likely a faster and cleaner approach than switching between the shaders.
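The white-texture trick can be sketched like this (hedged; the struct and resource names are assumptions): the pixel shader modulates the texture sample by the interpolated vertex color, so binding a 1x1 pure-white texture makes the textured path collapse to plain vertex coloring with no shader switch:

```hlsl
float4 PS(PSInput input) : SV_TARGET
{
    // With a pure-white texture bound, the sample is (1,1,1,1),
    // so the result is just the interpolated vertex color.
    // With a real texture bound, white vertex colors leave the
    // texture unchanged.
    return input.color * diffuseTexture.Sample(linearSampler, input.texcoord);
}
```

The same vertex format and shader then serve both colored and textured polygons.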

But that's the point: when I try to supply color information and texture coordinates in one vertex format, it can't find the texture coordinates, and I think it always reads them as 0. That's the whole problem I've been having.


Post the actual code. Show us how you are uploading the vertex data and show us the actual shaders.

Not "similar" code, not "example" code, and not descriptions of the code, but post the actual code. Maybe we'll spot something.

[Edit:] I didn't realize you were using DirectX, which is unfamiliar to me, so someone else will have to look over the code you post, but you should still post it.

This topic is closed to new replies.
