
How do you fade in/out a texture?

Started by March 23, 2003 09:03 PM
7 comments, last by Xiachunyi 20 years, 11 months ago
Hello everyone, I was wondering if there was a simple way to fade a texture in/out so you can see an object gradually appear. Does it involve GL_LIGHTING or GL_BLEND? I have an idea for how to do it, but it involves loading lots of gamma-corrected textures and cycling through them.
glColor4ub(Red,Green,Blue,HERE!);

I noticed that as HERE! nears 0 the texture gets darker and darker,
and as it nears 255 it gets brighter and brighter.

Just make it a Variable and increase it when you want it to fade in.
I'm not really sure that works as-is.
What you're doing is multiplying FRAGCOLOR = VERTEXCOLOR * TEXTURECOLOR via GL's texture environment (GL_MODULATE).

Too bad if the texture environment is GL_REPLACE, since then this won't work (it will be FRAGCOLOR = TEXTURECOLOR).
I suggest explaining what you're doing - someone may not get what's really happening.

Does it really get darker as ALPHA approaches 0? I'm not sure about this - if the background is white it will actually get brighter (provided blending is on; I don't know what will happen if it's off).

Well, the idea is the best I can think of; I just wanted to point out those things.


Bye

[edited by - Krohm on March 24, 2003 7:19:00 AM]

Previously "Krohm"

I got it to work, sort of; it's better than nothing:

My "dirty" code with implementation from Kevin Harris:


    void startscene()
    {
        glDisable(GL_LIGHTING);
        glEnable(GL_DEPTH_TEST);  // enable the z-buffer
        glEnable(GL_TEXTURE_2D);  // enable texture mapping
        glShadeModel(GL_SMOOTH);  // smooth shading between vertices (vertex coloring)
        glClearColor(0.0, 0.0, 0.0, 0.0); // clear color is black
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        glMatrixMode(GL_MODELVIEW);
        glLoadIdentity();
        glTranslatef(0.0, 0.0, -5.0);
        glRotatef(g_heading, 0.0, 1.0, 0.0); // rotate about the y-axis
        glRotatef(g_pitch, -1.0, 0.0, 0.0); // rotate about the x-axis
        g_heading += 0.5f;

        // Assign texture
        glBindTexture(GL_TEXTURE_2D, texture[12]);
        glColor4d(1.0f, drop_color, drop_color, fade);

        // Render a sphere with texture coordinates
        RenderSphere(0.0f, 0.0f, 0.0f, 1.5f, g_nResolution);

        glFinish(); // make sure everything has been sent
        SwapBuffers(hDC);
    }

The 'once-again' counting code:

    int DrawGLScene()
    {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        // The starting screen
        if(screen == -1)
        {
            if(fade < 0.9)
            {
                glEnable(GL_BLEND);
            }
            if(fade > 0.9)
            {
                glDisable(GL_BLEND);
            }
            fade = fade + 0.0005f;

            startscene(); // startscene() is a function so I won't have to go through the setup again once I'm done.

            if(fade > 1.0)
            {
                drop_color = drop_color - 0.001f;
            }
            if(drop_color < 0.0f)
            {
                screen = 0;
            }
        }
        ...


Edit: Thank you very much!

[edited by - Xiachunyi on March 24, 2003 9:39:55 PM]
You're forgetting to make that code time based. Right now it'll fade at a different speed on my computer than on your computer, and on everyone else's computer.
I have that timing handled via a busy-wait loop embedded within the cycle code:

float start=TimerGetTime();
...
while(TimerGetTime() < ...); // Waste Cycles On Fast Systems
I was thinking something along the lines of

fade+=0.0005f*frametime;

where frametime is the time between frames...
Ahhh, I see now, thanks.
How come the glColor4d(...) call affects other viewports drawn after it?

This topic is closed to new replies.
