
Hi masters: Per Pixel Lightning problem

Started by Leyder Dylan, September 21, 2002 05:11 AM
22 comments, last by Leyder Dylan 22 years, 5 months ago
Sorry to bring over a message from another forum, but I don't know how to insert some source code here. Here's my message: http://www.flipcode.com/cgi-bin/msg.cgi?showThread=00006016&forum=3dtheory&id=-1 Thanks for your answers.

========================
Leyder Dylan
http://ibelgique.ifrance.com/Slug-Production/
Register combiners take quite a while to learn how to write properly, so I suggest you just keep at it and dig deeper into nvidia's white paper collection. You can do some pretty nice things with them, although now that I'm an ATI convert, well, they aren't that great.

<-- smile :-)
If you read the FAQ, you will find that source code can be inserted simply by using the [source] and [/source] tags.
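For example, wrapping a snippet like this (the snippet itself is just an arbitrary illustration):

[source]
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE);
[/source]

makes it show up as formatted code in the post.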

As for register combiners, yes it takes "quite a while to learn how to write properly", and yes it's not portable. And with fragment programs being approved by the ARB recently, the register combiners will probably become less and less popular.
vincoof.. unless something like GL_NV_text_register_combiners or GL_NV_text_pixel_shaders is coming (I'd suggest both, one for gf1/gf2, one for gf3/gf4), register combiners will still be there for quite a while, just because the cards are there as well.. GL_ARB_fragment_program is for the radeon9700 and nv30..
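To make that concrete, here's a minimal sketch (mine, not from either vendor's samples) of the kind of runtime check this implies: prefer GL_ARB_fragment_program where the driver exposes it, and fall back to register combiners on GeForce-class cards.

[source]
#include <GL/gl.h>
#include <string.h>

/* Crude but common 2002-era check: is the name in the extension string? */
static int hasExtension(const char *name)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, name) != NULL;
}

static void choosePixelPath(void)
{
    if (hasExtension("GL_ARB_fragment_program"))
    {
        /* radeon9700 / nv30 class: use ARB fragment programs */
    }
    else if (hasExtension("GL_NV_register_combiners"))
    {
        /* gf1 - gf4: fall back to register combiners */
    }
    else
    {
        /* plain multitexture / fixed-function fallback */
    }
}
[/source]

(strstr can false-positive on longer extension names sharing a prefix, but it's the usual quick check.)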

"take a look around" - limp bizkit
www.google.com
If that's not the help you're after then you're going to have to explain the problem better than what you have. - joanusdmentia

My Page davepermen.net | My Music on Bandcamp and on Soundcloud

I've prepared a light version of my engine, so if you want, you can try it with the full source code.

Use these keys:

F ==> move the light toward Z
V ==> move the light toward you
D ==> move the light left
G ==> move the light right

You can download the source code here :

http://ibelgique.ifrance.com/Slug-Production/Slug-3D-V-0.6-Light.zip
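(For readers who just want the idea without downloading: a minimal sketch of that kind of key handling, assuming a GLUT-style keyboard callback and a lightPos array fed to the lighting pass each frame. All names and directions here are my guesses, not taken from the Slug-3D source.)

[source]
#include <GL/glut.h>

/* Light position handed to the lighting pass every frame. */
static GLfloat lightPos[4] = { 0.0f, 2.0f, 0.0f, 1.0f };

static void keyboard(unsigned char key, int x, int y)
{
    const GLfloat step = 0.5f;
    switch (key)
    {
        case 'f': lightPos[2] += step; break;  /* along +Z */
        case 'v': lightPos[2] -= step; break;  /* along -Z */
        case 'd': lightPos[0] -= step; break;  /* along -X (left) */
        case 'g': lightPos[0] += step; break;  /* along +X (right) */
    }
    glutPostRedisplay();
}

/* Registered once at startup: glutKeyboardFunc(keyboard); */
[/source]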


========================
Leyder Dylan
http://ibelgique.ifrance.com/Slug-Production/
Oops, here's the correct link:


http://ibelgique.ifrance.com/Slug-Production/Slug3D-V-0.6-Light.zip

A screenshot of my problem:

http://ibelgique.ifrance.com/Slug-Production/Images/Screenshot_0.jpg





========================
Leyder Dylan
http://ibelgique.ifrance.com/Slug-Production/
Thinking about the reply on flipcode, I also think it's a depth buffer issue.
I can't compile since I don't have the DX SDK, so I can't test the idea, but it seems really logical.

BTW, it's "per-pixel lighting", not "per-pixel lightning".
Lighting (EN) = Eclairage (FR) -- the radiant effect emitted from light sources
Lightning (EN) = Eclair (FR) -- the electrical phenomenon given off during a thunderstorm
davepermen: you're right that register combiners will always be useful for GeForce-based cards (at least, up to GF4) but what I was trying to point out is that it's becoming an obsolete feature, and it would be wise not to spend too much effort on it.

Leyder Dylan: I also think that you should NOT enable blending for the first pass, because your objects are NOT transparent. Call glDisable(GL_BLEND) or replace glBlendFunc(GL_ONE, GL_ONE) with glBlendFunc(GL_ONE, GL_ZERO). Personally I prefer calling glDisable(GL_BLEND) instead (and don't forget to enable it later) because it's clearer and has better performance.
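To make both suggestions concrete, here is a minimal sketch of a typical two-pass setup (my own assumptions, not code from the Slug-3D engine; drawScene() is a hypothetical stand-in for rendering the geometry): the first pass lays down depth with blending off, and each light is then added on top with the depth test set to GL_EQUAL.

[source]
/* Pass 1: base/ambient pass. Writes depth, no blending. */
glDisable(GL_BLEND);
glDepthFunc(GL_LESS);
glDepthMask(GL_TRUE);
drawScene();                   /* hypothetical: render the geometry once */

/* Pass 2..N: one additive pass per light. */
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE);   /* add this light's contribution */
glDepthFunc(GL_EQUAL);         /* only touch pixels of the already-drawn surface */
glDepthMask(GL_FALSE);         /* depth is already correct, don't rewrite it */
drawScene();                   /* same geometry, lighting set up for this light */

/* Restore defaults afterwards. */
glDepthMask(GL_TRUE);
glDepthFunc(GL_LESS);
glDisable(GL_BLEND);
[/source]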
well, gf2, gf3 and gf4 will survive for quite a while so the support still has to be there, and the new nvidia cards still have register combiners even with fragment programs (yes!)..

but on one point you're right. they are obsolete, namely the nvidia cards. very bad hw design.. they will die out sooner or later, i think (i hope soon)

"take a look around" - limp bizkit
www.google.com
If that's not the help you're after then you're going to have to explain the problem better than what you have. - joanusdmentia

My Page davepermen.net | My Music on Bandcamp and on Soundcloud

I wouldn't want to run the discussion off topic, so we'd better start another thread to go further, but all I can say is that I agree the GeForces are far from perfect, and I disagree that they "have very bad hw design".

This topic is closed to new replies.
