I'm trying to create a streaming server which, on command, renders a frame (using DirectX 11) and sends it out as an H.264 packet (using NVENC). What I'm noticing is that if I don't call IDXGISwapChain::Present to display the results to a window, my performance slows down. Is there some kind of performance penalty for offscreen-only rendering? Is there any way to avoid this?
Offscreen rendering performance
Also, if I display the frame rate in a second window, I can see it go down when I stop calling IDXGISwapChain::Present for the first window. This is all with vsync disabled.
That doesn't make sense. How exactly are you timing your code? If you are using an FPS counter from the Nvidia or AMD overlay, then yes, it's based on presenting images to the screen.
If you want, you can render at 4K offscreen but make your main window something very small, like 32x32, and present a simple small output just so that you are updating something and your FPS overlay is happy that it is receiving updates.
@dpadam450 I have a web page that samples the time and sends a request for an H.264 frame to the main program, which receives the request, renders the frame, and sends back the encoded frame. The web page sends the request 30 times a second, averages the response times, and updates the HTML output once every second. If I call IDXGISwapChain::Present continuously in the main program (even if I'm just presenting a blank screen), the response time cuts in half. I'm feeling like this is some kind of Windows driver thing that prioritizes programs that are actively rendering to the screen.