Cg Depth Buffer Problem
Hello,
I'm writing a simple shader in Cg whose job is to output the Z component of a position in clip space. I render this to a texture, and there I see something strange with the depth.
If you go to http://foto.webalice.it/emanuele.russo
you can find a picture. I didn't know how to attach one here... well...
The picture shows the output of the depth shader. It is a simple torus made of 16 vertices; the camera is looking straight at a corner, and the torus is rotated slightly upward. Grey means near, white means far.
Well, the code of the shader is the following:
// VERTEX
void v_Unlit
(
    in float3 kModelPosition : POSITION,
    out float4 kClipPosition : POSITION,
    out float fDepth : TEXCOORD1,
    uniform float4x4 WVPMatrix
)
{
    // Transform the position from model space to clip space.
    float4 kHModelPosition = float4(kModelPosition, 1.0f);
    kClipPosition = mul(kHModelPosition, WVPMatrix);

    // Pass raw clip-space z to the pixel shader.
    fDepth = kClipPosition.z;
}
// PIXEL
void p_Unlit
(
    in float fDepth : TEXCOORD1,
    out float4 kPixelColor : COLOR
)
{
    // Write the interpolated depth as a greyscale color.
    kPixelColor.rgb = fDepth;
    kPixelColor.a = 1.0f;
}
The error in the picture is a white area, following the edge of the torus, that in my opinion should be grey... meaning that the inner side is being drawn in front!
Is this a known problem that I'm simply too ignorant to recognize, or do you need more details about the situation?
(Exclude a driver problem; I get the same result on Nvidia and ATI.)
Thanks a lot
Emanuele
It looks like the actual z-buffering is not working.
Make sure you have a z-buffer and have turned it on with the right z-test function (GL_LEQUAL).
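As a sketch of the state this reply describes (assuming a classic fixed-function OpenGL context; this is a fragment, not a standalone program):

```c
/* Illustrative depth-test setup; requires an active GL context. */
glEnable(GL_DEPTH_TEST);   /* turn z-testing on */
glDepthFunc(GL_LEQUAL);    /* pass fragments at or nearer than the stored depth */
glClearDepth(1.0);         /* clear to the far plane */
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
```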
Thanks,
I checked that; the z-buffer seems OK.
I never touched it, so it is set to the default value of GL_LEQUAL and is enabled. I tried changing that value: with GL_LESS, for example, the result stayed the same, and with GL_NEVER everything turned grey.
The problem appears during render-to-texture... is it possible that the z-buffer gets disabled during that kind of rendering?
Thanks
Are you using FBOs? If so, remember that you have to create a new depth buffer for each FBO that needs z-testing.
Alternatively, it could be faulty polygon winding or face culling.
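A quick way to test the culling hypothesis, sketched under the assumption of a fixed-function GL context (again a fragment, not a standalone program):

```c
/* If disabling culling makes the artefact disappear, the torus's
 * winding order does not match the current glFrontFace setting. */
glDisable(GL_CULL_FACE);

/* Otherwise, make the expected winding explicit: */
glEnable(GL_CULL_FACE);
glFrontFace(GL_CCW);    /* counter-clockwise = front-facing (GL default) */
glCullFace(GL_BACK);    /* discard back-facing triangles */
```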
Yes, I'm actually using FBOs, but I'm modifying code written by someone else...
Ah, a z-buffer per FBO? OK, I'll try now; I'll let you know soon.
Thanks
Well, we're probably close to the solution.
When I run the shader to the screen, the depth is correct. When I enable rendering to texture, the error appears on the texture (I save it to a file).
Now, my engine is Wild Magic, and it uses OpenGL. I actually don't know if it uses FBOs, because I can't see any command to instantiate a new depth buffer. I only have a framebuffer class where I set the clear values for the buffers (stencil, depth, and frame), and to enable the offscreen framebuffer it calls something like glBindFramebufferEXT.
The strange thing is that when I change the clear-depth value to 0.0, for example, the on-screen render correctly shows everything grey, while the offscreen render still shows the image with the error.
That clearly means the z-buffer is not being cleared... doesn't it?
How would you do this in OpenGL? How would you set up offscreen rendering with a correct z-buffer? With your help I could check whether my engine does the right steps!
Thanks
Emanuele
I had a look at the steps my engine uses to create an FBO. To create a new one, it does:
// Generate a new framebuffer object.
glGenFramebuffersEXT(1, &m_uiFrameBufferID);

// Bind it so it becomes the render target.
glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, m_uiFrameBufferID);

// Attach the target texture as the color buffer.
glFramebufferTexture2DEXT(GL_FRAMEBUFFER_EXT, GL_COLOR_ATTACHMENT0_EXT,
    GL_TEXTURE_2D, m_uiTargetID, 0);

GLenum uiStatus = glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT);
switch (uiStatus)
{
case GL_FRAMEBUFFER_COMPLETE_EXT:
    glBindTexture(GL_TEXTURE_2D, 0);
    // Re-bind the default framebuffer.
    glBindFramebufferEXT(GL_FRAMEBUFFER_EXT, 0);
    return true;
    // (remaining cases omitted in the quote)
No mention of a depth buffer... what are the commands to attach a depth buffer to each FBO, as you said?
Thanks again
Emanuele Russo
Solved!
The engine didn't create a depth buffer per FBO!
I inserted these lines:
GLuint depthbuffer;
glGenRenderbuffersEXT(1, &depthbuffer);
glBindRenderbufferEXT(GL_RENDERBUFFER_EXT, depthbuffer);
glRenderbufferStorageEXT(GL_RENDERBUFFER_EXT, GL_DEPTH_COMPONENT, width, height);
glFramebufferRenderbufferEXT(GL_FRAMEBUFFER_EXT, GL_DEPTH_ATTACHMENT_EXT, GL_RENDERBUFFER_EXT, depthbuffer);
to create a depth buffer per FBO, and it works!
I found a very interesting article about it here on GameDev:
http://www.gamedev.net/reference/articles/article2331.asp
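For completeness, it's worth re-checking framebuffer status after attaching the depth renderbuffer, and releasing the renderbuffer on teardown. A sketch, mirroring the EXT entry points used above (requires a GL context with EXT_framebuffer_object):

```c
/* Verify the FBO again after attaching the depth renderbuffer. */
if (glCheckFramebufferStatusEXT(GL_FRAMEBUFFER_EXT) != GL_FRAMEBUFFER_COMPLETE_EXT)
{
    /* Incomplete: e.g. mismatched attachment sizes or an
     * unsupported depth format on this driver. */
}

/* When the FBO is destroyed, free its renderbuffer too. */
glDeleteRenderbuffersEXT(1, &depthbuffer);
```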
This topic is closed to new replies.