Pbuffers and reading the depth?
I check to see if something is in front of a given point by doing the following. I do a gluProject with the point to get the screen x and y, and the depth at that point. Then I do a glReadPixels at that same x and y, getting the depth of anything drawn so far. If gluProject's depth is less than the one I get from glReadPixels, I draw.
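Roughly, the check looks like this (just a sketch of the idea; px/py/pz is the point being tested, and the matrices and viewport are queried from the current state):

GLdouble model[16], proj[16];
GLint view[4];
glGetDoublev(GL_MODELVIEW_MATRIX, model);
glGetDoublev(GL_PROJECTION_MATRIX, proj);
glGetIntegerv(GL_VIEWPORT, view);

GLdouble winX, winY, winZ;
gluProject(px, py, pz, model, proj, view, &winX, &winY, &winZ);

GLfloat bufferZ;
glReadPixels((GLint)winX, (GLint)winY, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &bufferZ);

if (winZ < bufferZ)
{
    // nothing drawn so far is closer than the point, so draw it
}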
The only problem is that when I do this with a pbuffer, the depth values I'm getting from glReadPixels are completely off; sometimes it will be 0, other times it will be some value other than it should be.
Does anyone have any idea what could be causing this?
How are you creating the pbuffer? Does it have a valid depth component?
Just a thought: the second parameter of wglMakeContextCurrentARB is the current read target. Is it the same as the draw target?
Yeah, for the pbuffer I have both the reading and the writing set to the front:
wglMakeContextCurrentARB(hPBufferDC, hPBufferDC, hPBufferRC);
glDrawBuffer(GL_FRONT);
glReadBuffer(GL_FRONT);
I create it using the recommended settings from the ATI paper:
int attr[] =
{
    WGL_SUPPORT_OPENGL_ARB, TRUE,                 // pbuffer will be used with gl
    WGL_DRAW_TO_PBUFFER_ARB, TRUE,                // enable render to pbuffer
    //WGL_BIND_TO_TEXTURE_RECTANGLE_RGB_NV, TRUE, // non power of 2
    WGL_BIND_TO_TEXTURE_RGBA_ARB, TRUE,           // pbuffer will be used as a texture
    WGL_RED_BITS_ARB, 8,                          // at least 8 bits for RED channel
    WGL_GREEN_BITS_ARB, 8,                        // at least 8 bits for GREEN channel
    WGL_BLUE_BITS_ARB, 8,                         // at least 8 bits for BLUE channel
    WGL_ALPHA_BITS_ARB, 8,                        // at least 8 bits for ALPHA channel
    WGL_DEPTH_BITS_ARB, 24,                       // at least 24 bits for depth buffer
    WGL_DOUBLE_BUFFER_ARB, FALSE,                 // we don't require double buffering
    0                                             // zero terminates the list
};
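(For reference, a minimal sketch of how an attribute list like this typically gets used to create the pbuffer; hDC, width, and height are placeholders for the window DC and pbuffer size:)

int pixelFormat;
UINT numFormats;
wglChoosePixelFormatARB(hDC, attr, NULL, 1, &pixelFormat, &numFormats);

int pbAttr[] = { 0 }; // no extra pbuffer attributes
HPBUFFERARB hPBuffer = wglCreatePbufferARB(hDC, pixelFormat, width, height, pbAttr);
HDC hPBufferDC = wglGetPbufferDCARB(hPBuffer);
HGLRC hPBufferRC = wglCreateContext(hPBufferDC);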
Is there something wrong with this?
Anyone else have any ideas?
Does this maybe not work with WGL_BIND_TO_TEXTURE_RGBA_ARB?
I just did a quick test, and it appears that on my graphics card (GeForce FX 5600 w/ 66.93 drivers) glReadPixels will only return valid depth information from a pbuffer if GL_DEPTH_TEST is enabled on the pbuffer.
Enigma
That is odd, Enigma, but alas, I made sure it was enabled when I read the pixel and it is still giving me bogus values :(
A bit of further testing shows it needs to be enabled for the drawing as well. If you don't want depth testing, it looks like you need to enable depth testing and just use a depth func of GL_ALWAYS.
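In other words, something like this (sketch):

glEnable(GL_DEPTH_TEST); // keep depth writes happening
glDepthFunc(GL_ALWAYS);  // but let every fragment pass, as if testing were off

GL_ALWAYS makes every fragment pass the comparison, so you keep the depth buffer updates without rejecting anything.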
Enigma
Thanks Enigma for spending the time to help me out.
OK, I draw most of my stuff using depth testing, but some of the "after effect" billboarded stuff is added after I draw everything else (that's when I test the depth with glReadPixels to see if I should draw), and I don't use depth testing for it. Would drawing these last few things without depth testing enabled cause glReadPixels to return bad depths?
On my system, anything that is rendered with GL_DEPTH_TEST disabled does not write to the depth buffer of the pbuffer. For example, if I render a screen-size quad at depth 0.5 with the depth test enabled, then disable the depth test and draw a screen-size quad at depth 0.25, then re-enable the depth test and read a value, I get 0.5. If I leave depth testing enabled when I render the second quad, I get 0.25.
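A sketch of that test (drawScreenQuad is a hypothetical helper that draws a screen-size quad at the given depth):

glEnable(GL_DEPTH_TEST);
drawScreenQuad(0.5f);  // depth buffer now holds 0.5

glDisable(GL_DEPTH_TEST);
drawScreenQuad(0.25f); // colour is written, but the depth buffer is untouched

glEnable(GL_DEPTH_TEST);
GLfloat depth;
glReadPixels(0, 0, 1, 1, GL_DEPTH_COMPONENT, GL_FLOAT, &depth);
// depth is 0.5, not 0.25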
The same thing actually happens when rendering to the screen instead of a PBuffer, and I think I remember reading that this is part of the spec (i.e. disabling the depth test implicitly turns off depth writes), but I haven't been able to find anything to confirm it.
EDIT: Found it. From the OpenGL 2.0 spec:
Quote: The depth buffer test discards the incoming fragment if a depth comparison fails. The comparison is enabled or disabled with the generic Enable and Disable commands using the symbolic constant DEPTH_TEST. When disabled, the depth comparison and subsequent possible updates to the depth buffer value are bypassed and the fragment is passed to the next operation.
(Emphasis mine)
Enigma