Hi,
I'm on Windows 10 with Visual Studio 2022, compiling C++17 against the multi-threaded MSVC runtime as a DLL (/MDd for debug, /MD for release). I'm using GLFW 3.3.7, glad (latest version as of 6/12/2022), and GLM 0.9.9.9.
I downloaded GLFW and built the test program from their quick start page, substituting GLM for linmath. It's the standard starter example that shows a spinning triangle.
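For reference, here's roughly what I'm running. It's condensed (error callbacks and checks stripped), and I've rewritten the shaders from memory, so the details differ a bit from my actual file, but the structure is the same as the quick start with GLM swapped in for linmath:

```cpp
#include <glad/glad.h>
#include <GLFW/glfw3.h>
#include <glm/glm.hpp>
#include <glm/gtc/matrix_transform.hpp>
#include <glm/gtc/type_ptr.hpp>

static const char* vsrc =
    "#version 330 core\n"
    "uniform mat4 MVP;\n"
    "layout(location = 0) in vec2 vPos;\n"
    "layout(location = 1) in vec3 vCol;\n"
    "out vec3 color;\n"
    "void main() { gl_Position = MVP * vec4(vPos, 0.0, 1.0); color = vCol; }\n";

static const char* fsrc =
    "#version 330 core\n"
    "in vec3 color;\n"
    "out vec4 frag;\n"
    "void main() { frag = vec4(color, 1.0); }\n";

static const float vertices[] = {
    // x, y, r, g, b
    -0.6f, -0.4f, 1.f, 0.f, 0.f,
     0.6f, -0.4f, 0.f, 1.f, 0.f,
     0.0f,  0.6f, 0.f, 0.f, 1.f,
};

int main() {
    glfwInit();
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    GLFWwindow* window = glfwCreateWindow(640, 480, "Triangle", nullptr, nullptr);
    if (!window) { glfwTerminate(); return -1; }
    glfwMakeContextCurrent(window);
    gladLoadGLLoader((GLADloadproc)glfwGetProcAddress);
    glfwSwapInterval(1); // vsync on -- CPU sits at about 4% with this

    GLuint vao, vbo;
    glGenVertexArrays(1, &vao);
    glBindVertexArray(vao);
    glGenBuffers(1, &vbo);
    glBindBuffer(GL_ARRAY_BUFFER, vbo);
    glBufferData(GL_ARRAY_BUFFER, sizeof(vertices), vertices, GL_STATIC_DRAW);
    glVertexAttribPointer(0, 2, GL_FLOAT, GL_FALSE, 5 * sizeof(float), (void*)0);
    glEnableVertexAttribArray(0);
    glVertexAttribPointer(1, 3, GL_FLOAT, GL_FALSE, 5 * sizeof(float),
                          (void*)(2 * sizeof(float)));
    glEnableVertexAttribArray(1);

    GLuint vs = glCreateShader(GL_VERTEX_SHADER);
    glShaderSource(vs, 1, &vsrc, nullptr);
    glCompileShader(vs);
    GLuint fs = glCreateShader(GL_FRAGMENT_SHADER);
    glShaderSource(fs, 1, &fsrc, nullptr);
    glCompileShader(fs);
    GLuint prog = glCreateProgram();
    glAttachShader(prog, vs);
    glAttachShader(prog, fs);
    glLinkProgram(prog);
    GLint mvpLoc = glGetUniformLocation(prog, "MVP");

    while (!glfwWindowShouldClose(window)) {
        int width, height;
        glfwGetFramebufferSize(window, &width, &height);
        float ratio = width / (float)height;

        glViewport(0, 0, width, height); // inside the loop, per the quick start
        glClear(GL_COLOR_BUFFER_BIT);

        // GLM in place of linmath: rotate about Z, then orthographic projection
        glm::mat4 m = glm::rotate(glm::mat4(1.0f), (float)glfwGetTime(),
                                  glm::vec3(0, 0, 1));
        glm::mat4 p = glm::ortho(-ratio, ratio, -1.0f, 1.0f, 1.0f, -1.0f);
        glm::mat4 mvp = p * m;

        glUseProgram(prog);
        glUniformMatrix4fv(mvpLoc, 1, GL_FALSE, glm::value_ptr(mvp));
        glBindVertexArray(vao);
        glDrawArrays(GL_TRIANGLES, 0, 3);

        glfwSwapBuffers(window);
        glfwPollEvents();
    }
    glfwTerminate();
}
```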
While I was running it, my computer crashed! Out of nowhere! When I restarted it, it went into the BIOS with a message saying my CPU was at a high temperature. My fans hadn't even kicked in! It was like my computer somehow didn't notice it was overheating!
I've turned down my overclock and tuned my fan profile so this doesn't happen as quickly now, but the strange thing is, this program isn't causing high CPU usage at all! The CPU sits at about 4% usage (I have glfwSwapInterval(1) set), and yet I can watch the “CPU Package” temperature rising in Armoury Crate! The plain “CPU” temperature doesn't rise nearly as much, so the fans never spin up; I have to turn the fan speed up manually to keep it from overheating!
What the freak!? No game I play does this. I have an AMD Ryzen 9 5900X with an NVIDIA RTX 3060 GPU. All my off-the-shelf games run great; I can overclock, no problem, and CPU and GPU temps stay low the whole time. There's something specific about This Particular Program that somehow heats up the CPU WITHOUT causing high CPU usage! This is so strange!
It can't possibly be GLM, can it? Is it something about memory-bus thrashing because of the MSVC /GS buffer security checks doing extra work for every buffer?
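For anyone wondering what I mean: my rough understanding is that /GS makes the compiler stash a cookie in the frame of any function with a local buffer and verify it on return, conceptually something like the hand-written approximation below. (This is not the actual compiler-generated code; the real symbols are __security_cookie and __security_check_cookie, and the names here are made up.)

```cpp
#include <cstdint>
#include <cstdlib>
#include <cstring>

// Stand-in for MSVC's per-module cookie (the real __security_cookie
// is initialized by the CRT at startup).
static std::uintptr_t g_cookie = 0xDEADBEEF;

static void check_cookie(std::uintptr_t c) {
    // The real check calls __security_check_cookie, which reports a
    // GS failure and terminates if the cookie was overwritten.
    if (c != g_cookie) std::abort();
}

void copy_name(const char* src) {
    std::uintptr_t cookie = g_cookie;          // cookie saved on entry
    char buffer[64];
    std::strncpy(buffer, src, sizeof(buffer) - 1);
    buffer[sizeof(buffer) - 1] = '\0';
    check_cookie(cookie);                      // verified before returning
}
```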
I dunno, grasping at straws here. But..? Really? My “CPU Package” overheats without showing high usage? And it doesn't happen with any of my games, only with this one OpenGL program that isn't even doing much? What could possibly be happening here? For now I'll try compiling in release mode instead of debug (I know GLM is a lot more efficient in release), make sure the release build compiles with /GS-, and try a few optimizations like moving the glViewport call out of the main render loop (sketch below).
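This is the glViewport change I mean -- a minimal sketch assuming the setup from the program above: set the viewport once before the loop, then only again from a framebuffer-size callback instead of every frame:

```cpp
// Update the viewport only when the framebuffer actually resizes,
// instead of calling glViewport on every iteration of the render loop.
glfwSetFramebufferSizeCallback(window, [](GLFWwindow*, int w, int h) {
    glViewport(0, 0, w, h);
});

int w, h;
glfwGetFramebufferSize(window, &w, &h);
glViewport(0, 0, w, h); // initial viewport, set once before the loop
```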
Any ideas?
edit: I just noticed my “CPU Package” temp, as displayed by Armoury Crate, jumps up from about 62 to 69(!) on the very next sample after I shut my test program off. So it's as if my test program is somehow preventing Armoury Crate from properly sampling the CPU package temp. What? How could it possibly be doing that?