I'm currently working on some old DirectX 9 code and ran into an issue with anti-aliasing. The game runs perfectly fine normally, on both AMD and NVIDIA graphics cards (four cards tested in total). Without changing any code, only setting m_d3dpp.MultiSampleType to something other than D3DMULTISAMPLE_NONE, the game still works as expected with anti-aliasing visibly active on the AMD cards, but the entire scene breaks on the NVIDIA cards. There everything looks as if z writing is completely disabled, with pixels drawn in draw-call order regardless of whether they are behind something. I already use CheckDeviceMultiSampleType to detect which settings are supported, and I've tried manually setting both the format and the quality level to different values; all give the same result. The game also checks the return values of various calls, such as clearing the z buffer, on errors; all succeed as if nothing is broken. Any ideas on what could cause this device-specific issue?
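Roughly how I pick the multisample settings (a simplified sketch of my setup code; the 4x sample count and formats are just examples, names like m_d3dpp are my own variables):

```cpp
#include <d3d9.h>

// Validate the desired multisample type before putting it into the
// present parameters; fall back to no AA if the device rejects it.
void PickMultiSample(IDirect3D9* pD3D, D3DPRESENT_PARAMETERS& d3dpp)
{
    DWORD qualityLevels = 0;
    // Note: this should be checked for the depth/stencil format as well,
    // not only the backbuffer format.
    if (SUCCEEDED(pD3D->CheckDeviceMultiSampleType(
            D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
            D3DFMT_X8R8G8B8, TRUE /* windowed */,
            D3DMULTISAMPLE_4_SAMPLES, &qualityLevels)) && qualityLevels > 0)
    {
        d3dpp.MultiSampleType    = D3DMULTISAMPLE_4_SAMPLES;
        d3dpp.MultiSampleQuality = qualityLevels - 1;
    }
    else
    {
        d3dpp.MultiSampleType    = D3DMULTISAMPLE_NONE;
        d3dpp.MultiSampleQuality = 0;
    }
}
```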
Anti-aliasing breaks depth/z writing on NVIDIA cards
Could you post a screenshot?
Don't forget to enable the D3D9 debug runtime; it can not only report (and break on) errors but also warnings, which can hint at issues.
It could also be a driver bug. If you can, try it on an Intel IGP.
2 hours ago, ProfL said: could you post a screenshot? […]
This is what it looks like on the NVIDIA cards (see screenshot). The scene is rendered front to back. Other DirectX 9 games run fine on these cards, just not my project.
I tried running the app with the debug library and DLL, but then it only shows some general info in the console on startup. I don't think I can enable the full debug runtime: all machines are running Windows 10, where the debug-mode option in the DirectX control panel is disabled.
Good idea, but no results unfortunately. I ran it on the Intel IGP on two machines and both worked fine.
It looks like the depth buffer is not bound at all. Could it be that you've created the color buffer with MSAA but the depth buffer without?
Also, try setting the depth render states again after setting the surfaces.
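To illustrate the first point: a manually created depth surface has to use the exact same multisample type and quality as the color target, otherwise depth testing silently stops working on some drivers. A sketch, with the 4x setting and D24S8 format as assumed values:

```cpp
#include <d3d9.h>

// The depth surface's multisample settings must match the bound color
// target's settings exactly.
IDirect3DSurface9* CreateMatchingDepth(IDirect3DDevice9* pDevice,
                                       UINT width, UINT height)
{
    IDirect3DSurface9* pDepth = nullptr;
    pDevice->CreateDepthStencilSurface(
        width, height, D3DFMT_D24S8,
        D3DMULTISAMPLE_4_SAMPLES, 0,  // same type/quality as the render target
        TRUE,                         // discard contents on surface change
        &pDepth, nullptr);
    pDevice->SetDepthStencilSurface(pDepth);
    return pDepth;
}
```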
I'm using EnableAutoDepthStencil and not doing any manual work on the depth buffer. That should be sufficient, right?
The render states are already set right before drawing the geometry, since they get switched on and off between 3D and GUI rendering.
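The state switching is basically this (sketch of the relevant render states only):

```cpp
#include <d3d9.h>

// Before drawing 3D geometry: depth test and depth writes on.
void BeginScene3D(IDirect3DDevice9* pDevice)
{
    pDevice->SetRenderState(D3DRS_ZENABLE, D3DZB_TRUE);
    pDevice->SetRenderState(D3DRS_ZWRITEENABLE, TRUE);
}

// Before drawing the GUI: depth test and depth writes off.
void BeginSceneGUI(IDirect3DDevice9* pDevice)
{
    pDevice->SetRenderState(D3DRS_ZENABLE, D3DZB_FALSE);
    pDevice->SetRenderState(D3DRS_ZWRITEENABLE, FALSE);
}
```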
EDIT:
I figured out the problem. It turned out I was applying multisampling to the backbuffer and auto depth buffer while actually rendering the scene to a texture surface, and I didn't know that case has to be handled differently. I'm now applying multisampling to a separate render target and using StretchRect to copy (resolve) it to the scene texture. I also had to create a manual depth surface with multisampling, because I work with multiple render targets for my shaders and only one of them uses multisampling.
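Roughly what the fix looks like (simplified sketch; formats, the 4x sample count, and names like pSceneTexture are from my setup, error handling omitted):

```cpp
#include <d3d9.h>

// Render the scene into a multisampled color target with a matching
// multisampled depth surface, then resolve into the plain scene texture.
void RenderSceneMSAA(IDirect3DDevice9* pDevice,
                     IDirect3DTexture9* pSceneTexture,
                     UINT width, UINT height)
{
    IDirect3DSurface9 *pMsaaRT = nullptr, *pMsaaDepth = nullptr;
    pDevice->CreateRenderTarget(width, height, D3DFMT_X8R8G8B8,
                                D3DMULTISAMPLE_4_SAMPLES, 0, FALSE,
                                &pMsaaRT, nullptr);
    pDevice->CreateDepthStencilSurface(width, height, D3DFMT_D24S8,
                                       D3DMULTISAMPLE_4_SAMPLES, 0, TRUE,
                                       &pMsaaDepth, nullptr);

    pDevice->SetRenderTarget(0, pMsaaRT);
    pDevice->SetDepthStencilSurface(pMsaaDepth);
    // ... draw calls ...

    // Resolve: StretchRect from a multisampled surface to a
    // non-multisampled one performs the MSAA resolve.
    IDirect3DSurface9* pSceneSurface = nullptr;
    pSceneTexture->GetSurfaceLevel(0, &pSceneSurface);
    pDevice->StretchRect(pMsaaRT, nullptr, pSceneSurface, nullptr,
                         D3DTEXF_NONE);

    pSceneSurface->Release();
    pMsaaDepth->Release();
    pMsaaRT->Release();
}
```

In a real renderer you would of course create the surfaces once at startup rather than per frame; they are created inline here just to keep the sketch self-contained.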
It's still unclear to me as to why everything worked on AMD and Intel hardware, and not on NVIDIA, but everything is working perfectly now. Thanks for the ideas!