Stupid question (I should know this)
When you just up and start writing OpenGL code, like NeHe's lesson 1, are you hardware accelerated by default? And what about mipmapping - if you build 2D mipmaps like in NeHe's lesson 6 (I think it's that one), is that using video memory by default? If not, how do you switch between software and hardware mode?
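For reference, the mipmap setup I mean is roughly what that lesson does (simplified from memory; the image loading and GL context setup are assumed to have already happened, and the variable names are placeholders):

// Assumes a current GL rendering context and an RGB image already loaded
// into 'pixels' with dimensions width x height.
GLuint texture;
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
gluBuild2DMipmaps(GL_TEXTURE_2D, GL_RGB, width, height, GL_RGB, GL_UNSIGNED_BYTE, pixels);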
Are there any tutorials or articles on this subject?
Thanks
Love means nothing to a tennis player
My nothing-to-write-home-about OpenGL webpage. (please pardon the popups!)
This article may have answered my own question:
http://www.gamedev.net/reference/programming/features/oglext/
Amazing what happens when you actually look for stuff instead of just posting a question, isn't it? :/
Love means nothing to a tennis player
My nothing-to-write-home-about OpenGL webpage. (please pardon the popups!)
Hmm, actually this article doesn't directly answer my question, unless it's telling me that OpenGL 1.1 doesn't support hardware acceleration at all, and I don't believe that to be the case.
I'll keep looking for an article on what I want to know - if anyone happens to know of one that they wouldn't mind posting, I'd really appreciate it.
Love means nothing to a tennis player
My nothing-to-write-home-about OpenGL webpage. (please pardon the popups!)
Here's another:
http://www.opengl.org/resources/faq/technical/mswindows.htm#0020
It looks like the only control you have over turning hardware acceleration on and off is your choice of pixel format - if you choose a pixel format that your hardware supports, it'll be hardware accelerated and you can't disable it.
I guess that's fine, though.
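To make that concrete, the request itself is the usual pixel format setup (a sketch of what NeHe lesson 1 style code does; the particular flags and bit depths below are just typical values, not anything mandated):

// Sketch: ask for a 32-bit, double-buffered, OpenGL-capable pixel format.
// Whether the one you actually get is accelerated depends on the installed driver.
PIXELFORMATDESCRIPTOR pfd;
ZeroMemory(&pfd, sizeof(pfd));
pfd.nSize      = sizeof(pfd);
pfd.nVersion   = 1;
pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
pfd.iPixelType = PFD_TYPE_RGBA;
pfd.cColorBits = 32;
pfd.cDepthBits = 24;

int format = ChoosePixelFormat(hDC, &pfd);   // hDC = your window's device context
SetPixelFormat(hDC, format, &pfd);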
Love means nothing to a tennis player
My nothing-to-write-home-about OpenGL webpage. (please pardon the popups!)
As far as I know, OpenGL is completely hardware accelerated, and a software renderer doesn't use OpenGL at all.
The Love Of Trees
OpenGL does drop into a software mode if your hardware doesn't support the hardware-accelerated features. Don't quote me, but I believe OpenGL doesn't support an accelerated 24-bit colour mode; an example of this is running Blender on older hardware - you see the difference quite quickly.
Also, I get a large drop in frame rate in my own OpenGL applications when I switch from 16-bit to 32-bit colour depth. As far as I know, the ATI M1 chipset on my laptop doesn't support accelerated 32-bit colour.
Anyway! ...drifting off there! ...hope that was of some use.
GCoder
I have an old PII 450 laptop with an 8 MB video card that I'm confident doesn't accelerate a bloody thing, and I can run OpenGL on it - but forget about frames per second, it's more like seconds per frame.
So I'm pretty sure OpenGL does have software rendering if necessary.
This question was more curiosity than anything else.
Love means nothing to a tennis player

My nothing-to-write-home-about OpenGL webpage. (please pardon the popups!)
There is the Microsoft reference rasterizer (the software OpenGL driver).
You can test for it easily by seeing whether the GL vendor string is Microsoft, or you can do the following after your call to ChoosePixelFormat:

PIXELFORMATDESCRIPTOR GLpfd;
... set GLpfd ...
PixelFormatID = ChoosePixelFormat(hDC, &GLpfd);
DescribePixelFormat(hDC, PixelFormatID, sizeof(GLpfd), &GLpfd);
if (GLpfd.dwFlags & PFD_GENERIC_FORMAT)
    ... software rasterizer ...

However, some parts of the GL rendering pipeline will individually be run in software if the hardware isn't there. A simple example is vertex shaders on NVIDIA cards: on hardware older than the GeForce 3, they always run on the CPU, which pretty much makes them useless (especially if you're also using, say, VBOs). There are more extreme cases too - for example, with an older NVIDIA card, if you used all of its texture units along with a hardware clip plane, it used to clip the triangle data on the CPU. There are other examples as well.
[edited by - RipTorn on February 18, 2004 9:16:46 PM]
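For the vendor-string check mentioned above, something like this works once the rendering context has been made current (a sketch; "Microsoft" and "GDI Generic" are what the generic implementation typically reports, but treat the exact strings as assumptions):

// Requires <string.h>; call only after wglMakeCurrent(), since the strings
// describe the active rendering context.
const char *vendor   = (const char *) glGetString(GL_VENDOR);
const char *renderer = (const char *) glGetString(GL_RENDERER);
if (vendor && strstr(vendor, "Microsoft") && renderer && strstr(renderer, "GDI Generic"))
{
    // We got Microsoft's generic (software) implementation, not a hardware ICD.
}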