Pixel shader, vertex shader, ...
Does anyone know what vertex shaders and pixel shaders are? Are they only available in Direct3D, or in OpenGL as well?
Also, does anyone know what a scene graph is?
Thanks.
For a good description of both pixel and vertex shaders, have a look at some of the presentation papers at www.nvidia.com/developer
As for OpenGL, it has its equivalents in vertex programs and (IIRC) texture shaders (which are apparently more versatile than pixel shaders). There's info about them at the above site as well.
Briefly (and correct me if I'm wrong), vertex shaders allow you to program your own transformation and lighting pipeline, which then gets carried out in hardware. You write the shaders in assembly-style code (but it's pretty readable) and submit it to the API. Every vertex you pass to the API then gets transformed and lit by your own code, but in hardware, so you don't lose any of the speed gained by not using the traditional fixed T&L pipeline. In fact, you could think of the fixed pipeline as a special case of the new programmable one.
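To make the idea concrete, here is a minimal sketch (in Python, not real shader code) of the kind of computation a vertex shader performs per vertex: transform the position by a matrix, then compute a simple diffuse lighting term. The matrix, light direction, and values here are invented example data, not anything from a real API.

```python
# Illustrative sketch of per-vertex work: transform by a 4x4 matrix,
# then a clamped Lambert (N . L) lighting term. All values are made up.

def transform(matrix, v):
    """Multiply a 4x4 row-major matrix by a 4-component vertex."""
    return [sum(matrix[r][c] * v[c] for c in range(4)) for r in range(4)]

def diffuse(normal, light_dir):
    """Clamped Lambert term: max(N . L, 0)."""
    d = sum(n * l for n, l in zip(normal, light_dir))
    return max(d, 0.0)

# Identity "model-view-projection" matrix for the example
mvp = [[1, 0, 0, 0],
       [0, 1, 0, 0],
       [0, 0, 1, 0],
       [0, 0, 0, 1]]

pos = transform(mvp, [1.0, 2.0, 3.0, 1.0])
lit = diffuse([0.0, 1.0, 0.0], [0.0, 1.0, 0.0])
```

A real vertex shader does exactly this kind of arithmetic, but expressed in the API's assembly-style instruction set and executed in hardware for every vertex submitted.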
quote: Original post by Tessellator
As for openGL, it has its equivalents in vertex programs and (IIRC) texture shaders..
.. + register combiners
(can't do per-pixel shading without them)
I discourage people from using those extensions, as only nVidia-based cards have them, and it would limit the program to those cards. I would recommend doing as much as possible with multitexturing, or other extensions that are more widely supported.
Celeron ][ 566 @ 850 | 256MB PC-100 CAS2 RAM | DFI PA-61 Mainboard (250MB memory bandwidth sucks, it should be 500MB) | ATI Radeon 32MB DDR LE | Windows 98SE
I'm not 100% sure what's in it, but when OpenGL 1.3 comes out (August) I'm sure there's going to be some sort of vertex program in it. I'm not too sure about pixel shaders.
http://members.xoom.com/myBollux
quote: Original post by NitroGL
I discourage people from using those extensions, as only nVidia-based cards have them, and it would limit the program to those cards. I would recommend doing as much as possible with multitexturing, or other extensions that are more widely supported.
You're saying that because you don't have a card which supports all those nice things, I assume?
Keep in mind that a GeForce2/3 MX is going to be the next 'standard' 3D card..
The majority of people seem to have nVidia cards these days anyhow, and the other major cards out there have crap OpenGL support (3dfx and ATI).
You could always use the nVidia extensions only when they are supported on the system the game is running on; otherwise they aren't used, and the user loses 90% of the eye candy that other cards wouldn't be able to support anyway due to their general crappiness.
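That detect-and-fall-back pattern can be sketched like this (a Python illustration, not real OpenGL code). The extension names are real OpenGL extension strings; the function names and the hypothetical extensions string stand in for what `glGetString(GL_EXTENSIONS)` would return.

```python
# Sketch of runtime extension detection with a fallback path.
# The extensions string is whatever the driver reports, space-separated.

def has_extension(ext_string, name):
    """Extension names are space-separated; split to avoid substring false positives."""
    return name in ext_string.split()

def pick_lighting_path(ext_string):
    """Prefer the fancy vendor path, fall back to widely supported features."""
    if has_extension(ext_string, "GL_NV_register_combiners"):
        return "per-pixel (register combiners)"
    if has_extension(ext_string, "GL_ARB_multitexture"):
        return "multitexture fallback"
    return "single-texture fallback"
```

The game renders the same scene either way; only the quality of the lighting path differs depending on what the card reports.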
-----------------------
"When I have a problem on an Nvidia, I assume that it is my fault. With anyone else's drivers, I assume it is their fault" - John Carmack
quote:
--------------------------------------------------------------------------------
Original post by NitroGL
I discourage people from using those extensions, as only nVidia-based cards have them, and it would limit the program to those cards. I would recommend doing as much as possible with multitexturing, or other extensions that are more widely supported.
--------------------------------------------------------------------------------
I always get angry when I read posts like that :-).
What's the point in buying a better video card then? Should we all be (and stay) happy with our Voodoo3 or TNT2?
Even game companies these days aren't using the special features of modern cards, which makes all games look the same. The industry always needs a kick in the ass from one company or another before the global level gets raised (just wait till Doom3 is released, you'll see what I mean).
And it's perfectly possible to support new features in a new engine while having a fallback for older cards. In my new 3D engine I use drivers: one for generic OpenGL (where I do most stuff with multitexturing), and I'm now working on a GeForce2 driver (where I'll be using register_combiners, cube mapping, ...); later there will be an ATI Radeon driver (when I find someone who wants to help me and has a Radeon :-) and later a GeForce3 driver. At startup the engine queries the vendor string of the driver and chooses the most applicable one (but I can force the use of a certain driver). These drivers are built into the executable, though, so when I write a new driver I need to rebuild the whole program. That's good enough for a hobby project, but it can be difficult for a company, because they would always need to update the whole engine. Crystal Space, by contrast, has a plug-in mechanism like COM: you write your drivers in separate DLLs and load them dynamically at startup. A company implementing it that way just needs to release a new DLL when new video card technology comes out.
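The vendor-string dispatch described above might look something like this sketch (Python for illustration; the driver names are invented, and the vendor substrings stand in for what `glGetString(GL_VENDOR)` reports):

```python
# Sketch of choosing a renderer backend from the reported vendor string,
# with an option to force a specific driver. Driver names are made up.

DRIVERS = [
    ("NVIDIA", "geforce2-driver"),   # register combiners, cube mapping, ...
    ("ATI",    "radeon-driver"),
]

def choose_driver(vendor_string, forced=None):
    """Pick the most applicable driver, falling back to generic OpenGL."""
    if forced is not None:           # allow forcing a certain driver
        return forced
    for key, driver in DRIVERS:
        if key in vendor_string:
            return driver
    return "generic-opengl"          # multitexturing-only fallback path
```

With a plug-in mechanism, the `DRIVERS` table would instead be populated by scanning for loadable DLLs, so new backends need no rebuild of the engine.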
A scene graph is a hierarchical representation of your scene (a level). Programmatically it mostly ends up as a tree structure. In this tree you put all the objects of your scene, and also objects known as render states. A render state describes a change of settings in your API (e.g. activating another texture or disabling lighting). A node that has a render state applies it to all its children, so when designed well a scene graph can minimize state changes in the API.
There is much more to scene graphs than this. I'm implementing a scene graph in my new engine, so if you have further questions contact me by e-mail.
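A tiny sketch of that idea (Python for illustration; the state names and the recorded "API calls" are invented): nodes form a tree, a node may carry a render state that applies to its whole subtree, and a state-setting call is only emitted when the required state differs from what the API is currently set to.

```python
# Minimal scene-graph sketch: tree of nodes, inherited render state,
# and redundant-state-change elimination during traversal.

class Node:
    def __init__(self, name, state=None, children=None):
        self.name = name
        self.state = state            # e.g. {"texture": "brick"}
        self.children = children or []

def render(node, required=None, current=None, calls=None):
    """Depth-first walk; 'required' is inherited state, 'current' is the
    actual (shared) API state, 'calls' records the emitted API calls."""
    current = current if current is not None else {}
    calls = calls if calls is not None else []
    required = dict(required or {})
    if node.state:
        required.update(node.state)   # child state overrides the parent's
    for k, v in required.items():
        if current.get(k) != v:       # only emit a call when state changes
            calls.append((k, v))
            current[k] = v
    calls.append(("draw", node.name))
    for child in node.children:
        render(child, required, current, calls)
    return calls
```

Grouping objects that share a texture under one state-carrying node means the texture is bound once for the whole group instead of once per object.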
You must do as much as possible to support most of the cards out there (you want to make money, don't you?)
That doesn't prevent you from supporting vendor-specific extensions at all; just don't spend too much time on them, since they're not what will make your game sell.
nVidia boards are popular; not so long ago it was 3Dfx boards. Don't assume everything will remain as it is today: there are many cards coming, and some will beat (or already beat) nVidia's boards in feature/price/speed/quality ratio.
Edited by - Ingenu on April 10, 2001 6:45:05 AM
hello there,
Maybe it is true that nVidia will not always be the big thing, but I think pixel and vertex shaders are here to stay. Right now they are only on nVidia boards, but ATI will surely implement them too. And if those two big vendors provide them (with Microsoft pushing them in DirectX), the competition will follow... thus making them a de facto standard.
But for now, personally, I wouldn't implement them and would rather concentrate on the gameplay...
What do you think?
This topic is closed to new replies.