
Video card noise due to read-only depth buffers

Started by December 19, 2010 06:19 AM
3 comments, last by karwosts 14 years, 2 months ago
I would like to preface this by saying I am not crazy, but this is an actual bug I am having with my graphics card.

In my DirectX 11 application, whenever I create and render with a D3D11_DEPTH_STENCIL_VIEW_DESC whose Flags member is set to D3D11_DSV_READ_ONLY_DEPTH (or OR'd with it), I get an incredibly harsh screeching noise from my graphics card. It basically sounds like nails on a chalkboard. I tried the typical paranoid fixes: clean rebuild, restart the PC, update drivers, and I still have it. The interesting thing is that when I run the application through PIX, the noise is gone. It only happens when I run it through Visual Studio 2010. I have only tried Debug, as I am getting a ton of compile errors in Release (but I plan to fix those next).
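For reference, the view gets created roughly like this (a trimmed-down sketch, not my exact code; the format and variable names are just placeholders):

#include <d3d11.h>

// Sketch of the setup in question: a depth-stencil view with the
// read-only depth flag set. Format and names are placeholders.
HRESULT CreateReadOnlyDepthView(ID3D11Device* device,
                                ID3D11Texture2D* depthTexture,
                                ID3D11DepthStencilView** outView)
{
    D3D11_DEPTH_STENCIL_VIEW_DESC dsvDesc = {};
    dsvDesc.Format = DXGI_FORMAT_D24_UNORM_S8_UINT;        // example format
    dsvDesc.ViewDimension = D3D11_DSV_DIMENSION_TEXTURE2D;
    dsvDesc.Texture2D.MipSlice = 0;

    // The flag in question; it can also be OR'd with D3D11_DSV_READ_ONLY_STENCIL.
    dsvDesc.Flags = D3D11_DSV_READ_ONLY_DEPTH;

    return device->CreateDepthStencilView(depthTexture, &dsvDesc, outView);
}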

Has anyone ever experienced something like this before? I am on Windows 7 64-bit using a GeForce 470.
I've had odd noises coming from my laptop before - especially when my app is running at a high framerate (1000s of fps), or when I've been pushing my ethernet port to its maximum throughput. Usually these sounds are high-pitched squeals, and incredibly annoying, but I doubt they're anything to worry about.
To create pixels fast enough to satisfy modern gaming needs, graphics chip manufacturers have had to cut some corners. Read on as I explain.

Remember, to produce each pixel on the screen, it first has to be evoked by a tiny pixie inside the chip. That pixie then has to manually hand it to the pixie that knows where to place it in xy-space. To do this fast enough, many pixies would be needed at the same time, or else the user would suffer visual tearing. But pixies cost a lot. So the manufacturers solve this problem in a quite brutal way! Since pixies are magical creatures, they can, with some concentration and the right incantation, be split apart momentarily and put back together again, mitigating the problem of being in several places at once. The end result is quite painful for the pixies, but it gets the job done. The high-pitched screaming of pixies being split up, and the endless stream of vocalized incantations that accompany them, is exactly what's causing the annoying sound you get when you push the performance envelope on a cheaper graphics card, simply because the manufacturer couldn't afford (or be bothered) to give the pixies a proper working environment. So sad <8(=
It is I, the spectaculous Don Karnage! My bloodthirsty horde is on an intercept course with you. We will be shooting you and looting you in precisely... Ten minutes. Felicitations!
Sorry to hijack your thread, but that was the best explanation I have ever heard!

******************************************************************************************
Youtube Channel

I usually notice this when I'm just starting out with applications that run at 1000fps or something ridiculous like that.

I guess you're just stressing out the display components, which aren't really designed to run that fast in real-world use. Once you get to more realistic frame rates, I'd assume it will go away.
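Since you're on D3D11, one easy way to test that is to cap the frame rate by syncing Present to the vertical blank (a one-line sketch; swapChain is just a placeholder for your IDXGISwapChain*):

// Waiting for vblank caps the frame rate at the monitor's refresh rate,
// instead of letting the render loop spin at 1000+ fps.
swapChain->Present(1, 0);   // sync interval 1, instead of Present(0, 0)

If the noise goes away with vsync on, then it was probably just the uncapped frame rate.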
My Projects:
Portfolio Map for Android - Free Visual Portfolio Tracker
Electron Flux for Android - Free Puzzle/Logic Game

This topic is closed to new replies.
