Low-end hardware and fullscreen blur (?)
In short: is there any downright dirty hack to perform a full-scene blur without the latest fancy hardware (and without the aid of accumulation buffers)? And I do mean OLD hardware, as I'm currently targeting my project to work in its full glory even on ATI's old Mobility models (a hard, yet not impossible task).
Oh, and it doesn't actually need to be that fast, as I'm planning to use it for menus (the game remains blurred in the background).
Thanks!
There is, but it is, as you said, downright dirty.
It's something that the old demo groups used about 15 years ago: overwriting.
Basically, what you do is:
1. Render the object only to the z-buffer.
2. Render the object normally, but with an alpha of around 0.1.
3. Clear the z-buffer.
4. Repeat steps 1-3 over and over, with increasing alpha, while moving the object slightly.
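A minimal sketch of that loop in old immediate-mode OpenGL is below; render_object(), the pass count and the jitter step are placeholders of mine, not values from the post:

/* Jittered-overdraw blur sketch; render_object() and the constants are hypothetical. */
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glDepthFunc(GL_LEQUAL);   /* the second pass must pass the depth test at equal depth */

for (int i = 0; i < 8; ++i)
{
    /* 1. lay down depth only */
    glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
    render_object();
    glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);

    /* 2. render again with a low, slowly increasing alpha */
    glColor4f(1.0f, 1.0f, 1.0f, 0.1f + 0.05f * i);
    render_object();

    /* 3. clear the depth buffer and nudge the object before the next pass */
    glClear(GL_DEPTH_BUFFER_BIT);
    glTranslatef(0.002f, 0.0f, 0.0f);
}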
Now, this effect doesn't work all the time and might look somewhat odd in places, for instance if you render a background on every pass.
Try messing with some of the blending values and so on for a better result.
But it is a downright dirty hack.
You could try capturing the frame buffer to a texture, blurring the frame buffer texture and then completely re-rendering the scene with the new blurred texture (make sure to disable your depth test when you do this). You should only notice a slight delay when you pause to go to the menus, and the complexity of the algorithm is independent of your scene's complexity (always on the order of your resolution, which on old hardware shouldn't be that big).
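Something along those lines, sketched with plain OpenGL 1.x calls; the texture object, window size and blur step are placeholders of mine, and on really old hardware the copied region usually has to be a power-of-two size:

/* Hypothetical sketch: grab the back buffer into a texture, then redraw it
   as a fullscreen quad with depth testing off. scene_tex was created earlier;
   win_w/win_h are assumed to be power-of-two sizes for old hardware. */
glBindTexture(GL_TEXTURE_2D, scene_tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);  /* no mipmaps */
glCopyTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, 0, 0, win_w, win_h, 0);

/* ...blur here, e.g. by reading the pixels back and averaging them... */

glDisable(GL_DEPTH_TEST);
glEnable(GL_TEXTURE_2D);
glMatrixMode(GL_PROJECTION); glLoadIdentity(); glOrtho(0, 1, 0, 1, -1, 1);
glMatrixMode(GL_MODELVIEW);  glLoadIdentity();

glBegin(GL_QUADS);
    glTexCoord2f(0, 0); glVertex2f(0, 0);
    glTexCoord2f(1, 0); glVertex2f(1, 0);
    glTexCoord2f(1, 1); glVertex2f(1, 1);
    glTexCoord2f(0, 1); glVertex2f(0, 1);
glEnd();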
I believe I've seen some old apps faking motion blur by just blending over the previous frame(s).
How they actually did it without using RTT is unclear to me... maybe they used to copy the backbuffer to a texture? This seems unlikely, unless they were tiling textures to reach the desired resolution, which is another thing I find hard to believe.
Anyway, those are the few hints I can give you. I understand it's not much but it's something.
PS: along those lines, a cheap blur would be to just set the texture LOD bias so that a whole mip level is cut away.
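For reference, that bias is a single state call on hardware exposing GL 1.4 or EXT_texture_lod_bias; the value 2.0 is just an example, and the texture needs mipmaps with GL_LINEAR_MIPMAP_LINEAR filtering for a smooth result:

/* Bias sampling two mip levels down; requires mipmapped textures. */
glTexEnvf(GL_TEXTURE_FILTER_CONTROL, GL_TEXTURE_LOD_BIAS, 2.0f);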
I took the initiative and implemented my suggestion:
www.eng.uwaterloo.ca/~rramraj/blur.zip
I basically took lesson 8 and added the blur code to it. I developed it in Debian Linux using SDL, but it should be directly portable to Windows (edit the makefile). Unfortunately, the blur algorithm provided makes a relatively weak blur (perhaps I should have used a more complex scene), but you should be able to modify the algorithm directly. It should work on most if not all implementations of OpenGL, because it uses fairly basic calls (get pixels and texture generation).
Hope this helps,
- llvllatrix
You get a much cleaner blur if you include the diagonals when averaging the pixel's data:
// Grabs the blurred version of the pixel
GLubyte get_blured_pixel(GLubyte * screen_buf, int ii, int jj, int color)
{
    GLubyte mid       = screen_buf[get_pixel_index(ii,   jj,   color)];
    GLubyte up        = screen_buf[get_pixel_index(ii,   jj+1, color)];
    GLubyte down      = screen_buf[get_pixel_index(ii,   jj-1, color)];
    GLubyte left      = screen_buf[get_pixel_index(ii-1, jj,   color)];
    GLubyte right     = screen_buf[get_pixel_index(ii+1, jj,   color)];
    GLubyte upleft    = screen_buf[get_pixel_index(ii-1, jj+1, color)];
    GLubyte upright   = screen_buf[get_pixel_index(ii+1, jj+1, color)];
    GLubyte downleft  = screen_buf[get_pixel_index(ii-1, jj-1, color)];
    GLubyte downright = screen_buf[get_pixel_index(ii+1, jj-1, color)];

    return (mid + up + down + left + right + upleft + upright + downleft + downright) / 9;
}
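For completeness, get_pixel_index above lives in the linked source; a hypothetical equivalent for a tightly packed RGB buffer of width SCREEN_W would be:

/* Hypothetical index helper for a packed RGB screen buffer (not the exact one from the zip). */
int get_pixel_index(int ii, int jj, int color)
{
    return (jj * SCREEN_W + ii) * 3 + color;
}

Note that at the screen borders the neighbour lookups step outside the buffer, so the loop calling get_blured_pixel should clamp or skip the edge pixels.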
Yeah, that's exactly the kind of effect I was looking for! Thanks a lot for the code!
I came up with a new algorithm using an approach along the lines of lc_overlord's suggestion. Instead of re-rendering the scene, I saved it to a texture and redrew the texture over and over again in a staggered manner, similar to the Gaussian blur in the first algorithm, with the appropriate alpha value. The resulting algorithm is real-time on my card (GeForce 6200). Here is the source:
www.eng.uwaterloo.ca/~rramraj/blur.zip
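A rough sketch of that staggered redraw, assuming the scene has already been captured into a scene_tex texture and a 0-1 ortho projection is active as in the earlier sketch; the pass count, jitter step and alpha fall-off are placeholders, not the values from the zip:

/* Hypothetical: draw the captured scene texture several times, each pass
   slightly offset and blended in, which accumulates into a blur. */
glDisable(GL_DEPTH_TEST);
glEnable(GL_TEXTURE_2D);
glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
glBindTexture(GL_TEXTURE_2D, scene_tex);

for (int i = 0; i < 8; ++i)
{
    float dx = 0.002f * (float)(i % 3 - 1);               /* tiny screen-space jitter */
    float dy = 0.002f * (float)(i / 3 - 1);
    glColor4f(1.0f, 1.0f, 1.0f, 1.0f / (float)(i + 1));   /* equal-weight accumulation */
    glBegin(GL_QUADS);
        glTexCoord2f(0, 0); glVertex2f(0.0f + dx, 0.0f + dy);
        glTexCoord2f(1, 0); glVertex2f(1.0f + dx, 0.0f + dy);
        glTexCoord2f(1, 1); glVertex2f(1.0f + dx, 1.0f + dy);
        glTexCoord2f(0, 1); glVertex2f(0.0f + dx, 1.0f + dy);
    glEnd();
}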
Cheers,
- llvllatrix