Trouble with depth linearization for oblique view frustums

Started by July 15, 2018 02:05 AM

Hi there, for the past while I've been working on a deferred renderer in OpenGL, and I've implemented planar reflections using the stencil buffer. To prevent objects behind the reflection plane from being drawn, I use the brilliant Oblique View Frustum Depth Projection and Clipping technique, which has solved that problem very well. However, various shaders I use require linearizing the depth buffer, and that has proven to be quite a burden with these oblique frustums. Fortunately I came across this article, which thankfully offers a solution, although I haven't had much success incorporating it into my own project. I admit my understanding of the more involved matrix math is lacking, and much of the detail in the article isn't very comprehensible to me. I've done a fair bit of searching online, and there don't seem to be any working examples of this technique available.
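For context, as I understand it (this is not the article's exact derivation), one general way to linearize depth regardless of the oblique modification is to un-project the stored depth through the full inverse of the projection matrix that was actually used for rendering, since that stays valid even after the third row has been replaced. Roughly what I mean is something like this; the uniform names are just placeholders, not my actual code:

```glsl
#version 330 core
// Sketch only: un-project the hardware depth through the inverse of the
// (oblique) projection matrix instead of using the usual near/far-only
// formula, which breaks once the third row has been modified.
uniform sampler2D uDepthTex;      // hardware depth buffer, values in [0, 1]
uniform mat4      uInvProjection; // inverse of the oblique projection matrix

// Returns a positive view-space distance in front of the camera.
float linearViewDepth(vec2 uv)
{
    float depth = texture(uDepthTex, uv).r;

    // NDC position of the fragment (assumes the default glDepthRange(0, 1)).
    vec4 ndc = vec4(uv * 2.0 - 1.0, depth * 2.0 - 1.0, 1.0);

    // Un-project to view space; the divide by w is what keeps this correct
    // even with the oblique third row.
    vec4 viewPos = uInvProjection * ndc;
    viewPos /= viewPos.w;

    // OpenGL view space looks down -Z, so negate to get a positive distance.
    return -viewPos.z;
}
```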

I wrote a basic depth-buffer visualization shader to test my implementation, and so far the results are quite bizarre (they look like the inverse of what I'd expect), so I'm fairly certain I've either overlooked something or gotten the math wrong. Could anybody experienced with matrix math take a look at my code and see if anything stands out? Any help or insight would be greatly appreciated.
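The visualization part itself is just a remap of the linearized value into a grey value, roughly along these lines (continuing from the sketch above, with hypothetical uNear/uFar uniforms for the camera's clip distances):

```glsl
// Continuation of the sketch above: map linearized depth into [0, 1] so
// near geometry comes out dark and far geometry bright. uNear/uFar are
// hypothetical uniforms, not names from my actual code.
uniform float uNear;
uniform float uFar;

in  vec2 vTexCoord;
out vec4 fragColor;

void main()
{
    float viewDepth = linearViewDepth(vTexCoord);
    float shade     = clamp((viewDepth - uNear) / (uFar - uNear), 0.0, 1.0);
    fragColor       = vec4(vec3(shade), 1.0);
}
```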

I'll attach my relevant code and a brief clip showcasing the issue (sorry about the small size of the clip).

Thanks!

 
