
Detecting hardware capabilities

Started by April 22, 2003 02:48 PM
6 comments, last by WIlk 21 years, 10 months ago
Is there a way to determine the capabilities (multitexture support, triple buffering, maximum resolution and color depth, driver version, etc.) of the graphics accelerator using GDI or OpenGL functions? And how can I check for OpenGL support at all?
For extension support (multitexturing, VBO, VP, ...), just check the extension string. You can get more detailed information with glGet*(). For color bits, check the pixel format descriptor.

btw: there is no triple buffering in OpenGL
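A minimal sketch of the extension-string check in C. In a real program the list comes from glGetString(GL_EXTENSIONS) with a current context; here the matching logic is shown on its own, since a plain strstr() is not quite enough (one extension name can be a prefix of another):

```c
#include <string.h>

/* Token-wise search of a space-separated extension list, as returned by
 * glGetString(GL_EXTENSIONS). Returns 1 if `name` appears as a whole token. */
int has_extension(const char *ext_list, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_list;

    while ((p = strstr(p, name)) != NULL) {
        /* Accept only matches bounded by start-of-string or a space on the
         * left, and a space or end-of-string on the right. */
        int left_ok  = (p == ext_list) || (p[-1] == ' ');
        int right_ok = (p[len] == ' ') || (p[len] == '\0');
        if (left_ok && right_ok)
            return 1;
        p += len;
    }
    return 0;
}
```

Typical use would be `has_extension((const char *)glGetString(GL_EXTENSIONS), "GL_ARB_multitexture")`, followed by glGetIntegerv() queries for the relevant limits.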

You should never let your fears become the boundaries of your dreams.
quote:
Original post by _DarkWIng_
For extension support (multitexturing, VBO, VP, ...), just check the extension string. You can get more detailed information with glGet*(). For color bits, check the pixel format descriptor.

That tests for driver support, but not for hardware acceleration. There's no easy way to check for hardware acceleration short of benchmarking, unfortunately.

quote:
btw: there is no triple buffering in OpenGL

When you request double buffering in OpenGL, you can receive either double or triple buffering, depending on driver settings. You're right that there is no way to pick which one OpenGL will use, though.



[edited by - cheesegrater on April 22, 2003 4:42:49 PM]
Thanks
quote:
Original post by CheeseGrater
There's no easy way to check for hardware acceleration short of

correction:

after ChoosePixelFormat, use DescribePixelFormat, and in the PIXELFORMATDESCRIPTOR it fills in, check dwFlags for PFD_GENERIC_FORMAT. That flag indicates no hardware acceleration (unless PFD_GENERIC_ACCELERATED is also set, which means an MCD driver is accelerating the generic format).
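The flag test can be sketched without pulling in Windows headers. The PFD_* values below are the ones from <wingdi.h> (on Windows you would include <windows.h> instead of defining them), and the classification follows the usual ICD/MCD/software interpretation of the two generic flags:

```c
/* Flag values as defined in <wingdi.h>; on Windows, include <windows.h>
 * instead of defining them here. */
#define PFD_GENERIC_FORMAT      0x00000040
#define PFD_GENERIC_ACCELERATED 0x00001000

/* Classify a pixel format's dwFlags (from the PIXELFORMATDESCRIPTOR that
 * DescribePixelFormat fills in):
 *   - neither flag set: full hardware acceleration (ICD)
 *   - both flags set:   partial hardware acceleration (MCD)
 *   - GENERIC only:     Microsoft's software renderer              */
const char *accel_kind(unsigned long flags)
{
    if (flags & PFD_GENERIC_FORMAT)
        return (flags & PFD_GENERIC_ACCELERATED) ? "MCD (partial hw)"
                                                 : "software";
    return "ICD (full hw)";
}
```

In real code you would call DescribePixelFormat(hdc, format, sizeof(pfd), &pfd) after ChoosePixelFormat and pass pfd.dwFlags to a check like this.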


| - Project-X - my mega project.. close... - | - adDeath - | - email me - |

| - When it comes to programming, the most obvious answer is usually the wrong one - |
nVIDIA drivers will announce version 1.4 of OpenGL.
Version 1.4 requires 3D texturing.
Thus, you can use 3D texturing even on a GeForce 2.
Unfortunately, it'll run in software if you do.
The pixel format will not be "generic," as it is still the nVIDIA driver handling it.

The solution, in this case, is to specifically look for the EXT_texture3D extension, as that's only announced on hardware that can actually do it in hardware.
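The version check itself is just a matter of parsing the "major.minor" prefix of the GL_VERSION string. A sketch (in a real program the string comes from glGetString(GL_VERSION) with a current context; 3D texturing entered core OpenGL in version 1.2, so any 1.2+ driver must expose it, possibly only in software, which is why the extension check above is the safer hardware test):

```c
#include <stdio.h>

/* Parse the "major.minor" prefix of a GL_VERSION string such as
 * "1.4.0 NVIDIA 43.45". Returns 1 on success, 0 on a malformed string. */
int parse_gl_version(const char *version, int *major, int *minor)
{
    return sscanf(version, "%d.%d", major, minor) == 2;
}

/* 3D texturing is part of core OpenGL from version 1.2 onward. */
int core_has_texture3d(int major, int minor)
{
    return major > 1 || (major == 1 && minor >= 2);
}
```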

In general, if you provide a trade-off or fallback path, you should make that trade-off available to the user, too, because different users make different nice-vs-fast trade-offs. Thus, it's often better to choose conservative defaults and let users turn various features on or off if they want to.
Thanks again. You've all been very helpful. Now, how about this: is there a way to draw the contents of a buffer in a window using GDI functions only, I mean without any extensions (neither OpenGL nor DX)?
quote:
Original post by RipTorn
Original post by CheeseGrater
There's no easy way to check for hardware acceleration short of

correction:

after ChoosePixelFormat, use DescribePixelFormat, and in the PIXELFORMATDESCRIPTOR it fills in, check dwFlags for PFD_GENERIC_FORMAT. This will indicate no hardware acceleration.



True, but I was discussing figuring out whether a given extension is hardware accelerated, not the basic OpenGL functions.

[edited by - cheesegrater on April 23, 2003 10:05:45 AM]

This topic is closed to new replies.
