
Where can I find shader instructions for GeForce2?

Started by February 24, 2003 04:26 PM
27 comments, last by 63616C68h 21 years, 8 months ago
Combiners let you specify certain parts of the shading equation performed for each fragment/pixel. For example:


    // Bumpmapping DOT3 setup
    gl.texEnvf(GL.TEXTURE_ENV, GL.TEXTURE_ENV_MODE, GL.COMBINE_ARB);
    gl.texEnvf(GL.TEXTURE_ENV, GL.COMBINE_RGB_ARB,  GL.DOT3_RGB_ARB);

    gl.texEnvf(GL.TEXTURE_ENV, GL.SOURCE0_RGB_ARB,  GL.TEXTURE);
    gl.texEnvf(GL.TEXTURE_ENV, GL.OPERAND0_RGB_ARB, GL.SRC_COLOR);

    gl.texEnvf(GL.TEXTURE_ENV, GL.SOURCE1_RGB_ARB,  GL.PRIMARY_COLOR_ARB);
    gl.texEnvf(GL.TEXTURE_ENV, GL.OPERAND1_RGB_ARB, GL.SRC_COLOR);


This is used to perform DOT3 lighting, using a normal map texture and a regular texture (Google "DOT3 bumpmapping" and you'll get better explanations). Combiners allow limited programmability, but only as far as selecting certain pre-defined modes (like GL.DOT3_RGB_ARB, which performs a dot product using the RGB values as the input vectors). And on a GF2 you're further restricted because you've only got two texture units (I'm stuck using one of them myself).
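Roughly, the full two-unit setup on a GF2 would look something like this (an untested sketch in plain C GL calls rather than the binding above; it assumes ARB_multitexture, ARB_texture_env_combine and ARB_texture_env_dot3 are available, and the texture ids are made up):

    /* Unit 0: dot the normal map with the light vector that was
       put into the primary (vertex) colour.                      */
    glActiveTextureARB(GL_TEXTURE0_ARB);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, normalMapTex);   /* hypothetical texture id */
    glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
    glTexEnvf(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB,  GL_DOT3_RGB_ARB);
    glTexEnvf(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB,  GL_TEXTURE);
    glTexEnvf(GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR);
    glTexEnvf(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB,  GL_PRIMARY_COLOR_ARB);
    glTexEnvf(GL_TEXTURE_ENV, GL_OPERAND1_RGB_ARB, GL_SRC_COLOR);

    /* Unit 1 (the second and last unit on a GF2): modulate the dot
       product coming out of unit 0 with the regular base texture.  */
    glActiveTextureARB(GL_TEXTURE1_ARB);
    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, baseMapTex);     /* hypothetical texture id */
    glTexEnvf(GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_COMBINE_ARB);
    glTexEnvf(GL_TEXTURE_ENV, GL_COMBINE_RGB_ARB,  GL_MODULATE);
    glTexEnvf(GL_TEXTURE_ENV, GL_SOURCE0_RGB_ARB,  GL_PREVIOUS_ARB);
    glTexEnvf(GL_TEXTURE_ENV, GL_OPERAND0_RGB_ARB, GL_SRC_COLOR);
    glTexEnvf(GL_TEXTURE_ENV, GL_SOURCE1_RGB_ARB,  GL_TEXTURE);
    glTexEnvf(GL_TEXTURE_ENV, GL_OPERAND1_RGB_ARB, GL_SRC_COLOR);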

Vertex programs are a separate matter, however: on the GF2 and earlier they're emulated on the CPU by the driver, so you don't really gain anything you couldn't do before.
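For the curious, this is roughly what driving one looks like through NV_vertex_program (an untested C sketch; it assumes the extension's entry points have already been fetched, and on a GF2 the driver simply runs the program in software):

    /* Minimal NV_vertex_program: transform the position by the tracked
       modelview-projection matrix and pass the vertex colour through.  */
    static const char vp[] =
        "!!VP1.0\n"
        "DP4 o[HPOS].x, c[0], v[OPOS];\n"
        "DP4 o[HPOS].y, c[1], v[OPOS];\n"
        "DP4 o[HPOS].z, c[2], v[OPOS];\n"
        "DP4 o[HPOS].w, c[3], v[OPOS];\n"
        "MOV o[COL0], v[COL0];\n"
        "END\n";

    GLuint vpId = 1;   /* any unused program id */
    glLoadProgramNV(GL_VERTEX_PROGRAM_NV, vpId,
                    (GLsizei)(sizeof(vp) - 1), (const GLubyte *)vp);
    glBindProgramNV(GL_VERTEX_PROGRAM_NV, vpId);
    /* Track the modelview-projection matrix into constants c[0]..c[3]. */
    glTrackMatrixNV(GL_VERTEX_PROGRAM_NV, 0,
                    GL_MODELVIEW_PROJECTION_NV, GL_IDENTITY_NV);
    glEnable(GL_VERTEX_PROGRAM_NV);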
The back of my I/O Magic nVidia GeForce2 MX 400 graphics card box reads (and I quote):
I/OMagic's GeForce2 MX 400 displays great lighting effects with its complex pixel shading and cube environment mapping.
Because they used the words "pixel" and "shading" together, I'm led to believe that my card DOES in fact have programmable capabilities, and NOT just a fixed-function T&L/MT pipeline! There must be a rat in the house...
Keep coming back, because it's worth it, if you work it, so work it, you're worth it!
regcoms = register combiners. That's how the pixel pipeline is internally represented in the GF1-GF4 cards. The GF3+ added the texture shader capability, which makes them 'semi-programmable'. Only the Radeon 9700 and the upcoming GFfx are fully programmable (still limited, of course, but they can execute small pixel programs). The GF3 and above are programmable at the vertex level. The GF2 has neither.

A regcom is like a router with fixed nodes. It gives you fixed building blocks that perform certain fixed equations. Then you have multiple data streams coming from the rasterizer. A regcom lets you route the data streams to the building blocks in a multitude of ways. But it's still a multiple-choice concept; it does not execute a program.
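To make the 'routing' concrete, here is an untested C sketch using NV_register_combiners (the extension that exposes this pipeline on GF1-GF4): one general combiner stage dots the normal map in texture unit 0 with the light vector in the primary colour, drops the result into the spare0 register, and the final combiner simply outputs it. It assumes the extension's entry points are already set up.

    /* Use a single general combiner stage. */
    glCombinerParameteriNV(GL_NUM_GENERAL_COMBINERS_NV, 1);

    /* RGB portion of stage 0: A = texture 0 (normal map), B = primary
       colour (light vector), both expanded from [0,1] to [-1,1].      */
    glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_A_NV,
                      GL_TEXTURE0_ARB, GL_EXPAND_NORMAL_NV, GL_RGB);
    glCombinerInputNV(GL_COMBINER0_NV, GL_RGB, GL_VARIABLE_B_NV,
                      GL_PRIMARY_COLOR_NV, GL_EXPAND_NORMAL_NV, GL_RGB);

    /* Route A dot B into spare0; discard the other outputs. */
    glCombinerOutputNV(GL_COMBINER0_NV, GL_RGB,
                       GL_SPARE0_NV, GL_DISCARD_NV, GL_DISCARD_NV,
                       GL_NONE, GL_NONE, GL_TRUE, GL_FALSE, GL_FALSE);

    /* Final combiner computes A*B + (1-A)*C + D; with A = spare0,
       B = 1, C = D = 0 it just outputs spare0 as the fragment colour. */
    glFinalCombinerInputNV(GL_VARIABLE_A_NV, GL_SPARE0_NV,
                           GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    glFinalCombinerInputNV(GL_VARIABLE_B_NV, GL_ZERO,
                           GL_UNSIGNED_INVERT_NV, GL_RGB);
    glFinalCombinerInputNV(GL_VARIABLE_C_NV, GL_ZERO,
                           GL_UNSIGNED_IDENTITY_NV, GL_RGB);
    glFinalCombinerInputNV(GL_VARIABLE_D_NV, GL_ZERO,
                           GL_UNSIGNED_IDENTITY_NV, GL_RGB);

    glEnable(GL_REGISTER_COMBINERS_NV);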

Look, you should really learn more about the internal workings of a 3D card's pipeline before attempting to write a shader. You need a thorough understanding of the pipeline to write even the simplest one. There are lots of documents on nVidia's or ATI's developer sites.

quote:
I/OMagic's GeForce2 MX 400 displays great lighting effects with its complex pixel shading and cube environment mapping.
Because they used the words "pixel" and "shading" together, I'm led to believe that my card DOES in fact have programmable capabilities, and NOT just a fixed-function T&L/MT pipeline!

Hehe. Marketing blah-blah. Look at the chipset manufacturer's technical specs if you want real information.


[edited by - Yann L on February 24, 2003 6:44:58 PM]
Look. I want to do it in NV15 architecture assembly, and nVidia and ATI have no such resources or blueprints. If only there were a professional in the house...

EDIT: And where exactly am I supposed to 'read up' on my specific graphics pipeline? Essentially, this is what I WILL be doing, but you still haven't given me any addresses of resources YOU'VE found useful. You sure do sound like an expert. But you also sound like a politician on C-SPAN--a whole lot of talking without saying anything. I need some insight! *&#@*($&UDFSd *pounds keyboard*

[edited by - 63616C68h on February 24, 2003 6:53:08 PM]
Keep coming back, because it's worth it, if you work it, so work it, you're worth it!
*sigh*

The NV15 does not have any shader assembly or microcode capability. That is why there is no such information available...

quote:
But you also sound like a politician on C-SPAN--a whole lot of talking without saying anything.

That is because you don't understand what OrangyTang and I were saying. So I would suggest reading up on some background first. Google is your friend. And nVidia's or ATI's developer sites even more. Visit them. Select an interesting document. Click on it. Read. Understand. Come back here.


[edited by - Yann L on February 24, 2003 6:57:20 PM]
quote: Original post by 63616C68h
Look. I want to do it in NV15 architecture assembly, and nVidia and ATI have no such resources or blueprints.


It cannot be done. Full stop. Except for the limited functionality described above, it's all fixed in the hardware. The GPU is not a general-purpose chip; it is highly specialised to do a certain set of operations, and to do them fast. You can't program it any more than you can program a light switch.
Bloody hell, you write your own programs for the ALU to execute! I know I cannot "burn" the chip, OK? All I want is to do whatever the hell you were talking about earlier.

EDIT: Let's stop beating this dead horse that no one can even do an autopsy on, and have YOU give me some information!

[edited by - 63616C68h on February 24, 2003 7:03:35 PM]
Keep coming back, because it's worth it, if you work it, so work it, you're worth it!
Well, I wish you good luck. Especially with that attitude.
Per pixel lighting. Found via Google, on the nVidia website no less! Shock! Horror!
You're a wry one! If you cannot explain something, then you are the one who doesn't understand! No one asked you to post, but I believe the FAQ of this forum states that it's good to post information that's directly useful to the person in need. That means not posting useless crap, but also not posting meaningless crap that only you and an elite few can understand. Am I in the right forum? I-I-I believe this is the "For Beginners" forum!
Keep coming back, because it's worth it, if you work it, so work it, you're worth it!

This topic is closed to new replies.
