
Can rendering without VSYNC overheat the GPU?

Started by February 16, 2011 01:27 PM
6 comments, last by Ravyne 13 years, 8 months ago
Hi!

According to news reports, the unpatched version of Starcraft II can fry your graphics card. Apparently, it renders the menu without frame limit and without VSYNC, and some GPUs might overheat because of that.

Well, users of my DirectX application report very high GPU temperature when VSYNC is disabled. The possibility that 3D rendering could destroy hardware is quite unsettling.

Do you think this is a real danger?
Generally it shouldn't be: a graphics card that can't safely operate at 100% capacity (assuming it isn't overclocked or anything) is essentially just a slower graphics card. Unfortunately, it can happen anyway.
I've had this happen to me when playing a very demanding game for too long. Then again, I decided to buy graphics cards without fans, which might have been a mistake. :)

It shouldn't actually break the hardware, though, unless something is wrong with the drivers, but it might cause a blue screen or a reboot.
Turning vsync off causes a lot more frames to be rendered in the same timespan.
I had an old graphics card that overheated when vsync was off. However, frying your gfx card because vsync is turned off is clearly a HARDWARE fault: either the clock speed is too high or there is not enough cooling. I don't know whether gfx adapters can shut themselves down when they get too hot, the way many CPUs do.
Yes, I'm also convinced that it shouldn't happen. However, it happened to some people with Starcraft II.

My users are paying customers. I must ensure that this doesn't happen, because it would be very bad for business.

A manual framerate limit could work, but I'm really curious about this issue.

Yes, you can cap the framerate manually. This is actually a good idea, since most people nowadays have LCDs that can't even display more than 60 fps. You then gain processing power that you can spend on other tasks such as physics or AI.
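A manual cap like that can be done with a simple sleep-based limiter. Here is a minimal sketch in C++ (the class and names are mine, not from the thread; a real engine would also integrate with the swap chain's own timing):

```cpp
#include <chrono>
#include <thread>

// Minimal frame limiter sketch: after rendering each frame, sleep until
// the next frame slot so the application never renders faster than
// targetFps.
class FrameLimiter {
public:
    explicit FrameLimiter(double targetFps)
        : frameBudget_(std::chrono::duration_cast<std::chrono::steady_clock::duration>(
              std::chrono::duration<double>(1.0 / targetFps))),
          next_(std::chrono::steady_clock::now() + frameBudget_) {}

    // Call once per frame, after rendering.
    void wait() {
        std::this_thread::sleep_until(next_);
        auto now = std::chrono::steady_clock::now();
        // If we fell behind (a slow frame), restart the schedule from now
        // instead of bursting frames to catch up.
        next_ = (now > next_ ? now : next_) + frameBudget_;
    }

private:
    std::chrono::steady_clock::duration frameBudget_;
    std::chrono::steady_clock::time_point next_;
};
```

At 60 fps the budget is about 16.7 ms per frame; any time not spent rendering is slept away instead of burned on extra frames.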

Yes, I'm also convinced that it shouldn't happen. However, it happened to some people with Starcraft II.

I'm not sure it actually fried any cards. I don't remember any reliable report of actual damage.

GPUs however do shut down due to overheating, especially in laptops.

My users are paying customers. I must ensure that this doesn't happen.

You cannot ensure that. Hardware overheats for many reasons, and unless you monitor every component, some of which do not report temperature, you simply cannot do that.

This is why all EULAs have that clause about hardware damage.

A manual framerate limit could work, but I'm really curious about this issue.

It can still overheat, just for other reasons.

My users are paying customers. I must ensure that this doesn't happen, because it would be very bad for business.

A manual framerate limit could work, but I'm really curious about this issue.

It is not a realistic requirement.

You should do your best to prevent problems from happening, but ultimately it is not your responsibility. Waiting for vblank is the most appropriate action in most cases to prevent tearing. They turned it off, and in a few circumstances it revealed problems with customers' hardware.



It is the customer's responsibility to ensure that their machine has adequate ventilation and cooling for their tasks, and otherwise is maintained properly.

There have been many different threats of lawsuits for hardware failure, and not just graphics cards. Decades ago there were similar complaints about software with heavy disk usage being the cause of disk failure.
As others have sort of said: it's not that rendering without VSync *causes* cards to overheat and die; it's that rendering without VSync can create more work for the GPU because it renders more frames, which, combined with a hardware, software, or environmental defect, can cause overheating. In any case, either the chips were faulty, the driver didn't rein things in, or perhaps the ambient temperature was too high or the GPU was poorly ventilated due to dust or an over-stuffed case.

None of these things are your fault, but it's in your best interest to do what you can to avoid these situations. In particular, and especially if your game has low enough requirements that it should run at a steady framerate on lower-powered hardware, you could remove the option of disabling vsync, or at least make it more difficult to switch on.

Remember, disabling vsync isn't really meant as a way to get framerates higher than the display can actually show; it's intended for people whose cards can't push frames at the display rate, but can push more than some integer fraction of it (e.g., if the display rate is 60 fps and the user's card can push only 50, then VSync would artificially hold them back to 30 fps -- this is why you get the option to turn it off: 50 fps with tearing might play better than 30 fps without).
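That "held back to an integer fraction" effect can be sketched numerically. A simplified model in C++ (the function name is mine; real swap-chain behavior with triple buffering or adaptive vsync differs):

```cpp
#include <cmath>

// Simplified model of vsync quantization: if the GPU can produce rawFps
// frames per second but each finished frame must wait for the next
// vblank, the effective rate snaps down to refreshHz divided by an
// integer. Ignores triple buffering and adaptive vsync.
double vsyncedFps(double refreshHz, double rawFps) {
    if (rawFps >= refreshHz)
        return refreshHz;               // fast enough: one frame per vblank
    int vblanksPerFrame = static_cast<int>(std::ceil(refreshHz / rawFps));
    return refreshHz / vblanksPerFrame; // one frame every N vblanks
}
```

With a 60 Hz display, a card that can render only 50 fps needs two vblanks per frame and ends up at 60 / 2 = 30 fps, matching the example above.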

If you want to make it more difficult to enable, so that, at least in theory, only people who know what they're doing will use it, I would add a command-line switch to your application that either enables it directly or makes the VSync option available in the settings interface. You could also build an intelligent system that monitors the framerate and, if it is commonly below the display's refresh rate, enables the option in the interface and suggests that the user try it out -- conversely, if the framerate is well above the display rate, the system could warn the user of the possible hazards and suggest turning vsync on.
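That monitoring idea could be reduced to a small decision helper. A sketch, where the 0.9 and 1.5 thresholds are arbitrary assumptions of mine to be tuned per application:

```cpp
// Decide what to suggest based on a measured average framerate versus
// the display refresh rate. Thresholds are illustrative, not tuned.
enum class VsyncAdvice { OfferVsyncOff, SuggestVsyncOn, NoChange };

VsyncAdvice adviseVsync(double avgFps, double refreshHz) {
    if (avgFps < refreshHz * 0.9)
        return VsyncAdvice::OfferVsyncOff;  // card can't keep up: expose the vsync-off option
    if (avgFps > refreshHz * 1.5)
        return VsyncAdvice::SuggestVsyncOn; // well above refresh: extra frames are wasted heat
    return VsyncAdvice::NoChange;
}
```

The averaging itself should span many frames (say, a few seconds) so a single slow frame doesn't flip the advice back and forth.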

throw table_exception("(? ???)? ? ???");

This topic is closed to new replies.
