Quake 3 artefact
It's not a Quake 3-only artefact, just one I noticed in several games and demos (it was very clear in "Mafia"'s smoke, and also in a GL text demo from some site I can't remember now). So I tried to track the artefact down, and "Print Screen"-ing it doesn't work... so I got a camera and snapped it.
Here are 2 pics from Quake 3:
http://www.geocities.com/cippyboy_7/quake3_1.jpg
http://www.geocities.com/cippyboy_7/quake3_2.jpg
I suggest you download them and view them at best fit (about 50% of the picture size) in 1024x768 with something like ACDSee (v5.0). As you can see, there's something like a small texture with a very stable pattern that looks like a Windows file icon (that's just my explanation of what I can see in there). Don't try to see the obvious colors, try to see inside the colors...
Anyway, if you want to see this for yourself, try Quake 3 with very low details, because at higher res/detail it disappears or can't be seen...
I'm pretty spooked by this "artefact" and I don't know what could cause such a thing, or maybe it's just a "quality vs. secondary effect" issue?
Relative Games - My apps
December 10, 2003 11:47 AM
Are you sure this isn't an issue with your monitor? The fact that you can't capture the artifact with print screen should be a big clue there...
I got this problem too when working on my first idea for the creative contest. It was a dark cave (so, less detail in colors than a light cave). I gave a friend a screenshot and he complained about seeing almost nothing.
Things you can do are:
Set your brightness level a bit higher.
Set your color intensity higher (I have it at 9100K).
Increase your color depth.
If that doesn't work, buy a new monitor.
- growl -
I've seen this before on:
- Overclocked cards
- 16-bit colour depth rendering on older, crappy cards (think S3, Intel integrated, etc.)
- Low colour depth and very low contrast textures & scenes.
[size="1"][[size="1"]TriangularPixels.com[size="1"]] [[size="1"]Rescue Squad[size="1"]] [[size="1"]Snowman Village[size="1"]] [[size="1"]Growth Spurt[size="1"]]
Well, I can say that my monitor is perfectly fine and it's less than a month old, because my last one died. It's an AOC, and I bet that even if you set the lowest options for everything, Quake 3 will still generate those things... in dark corners at least.
My video card is not that old -> GeForce FX 5200, and I bet you can see it on any card... it may just be some hardware error, but what troubles me is that it has that specific pattern... it would be very stupid for "John Carmack", or who knows, maybe they're using some unknown tech that causes that error... just my opinion...

Relative Games - My apps
December 11, 2003 11:18 AM
What you are referring to is caused by 16-bit dithering.
And it's perfectly normal. I think if you saw how 16-bit would look without it, you would prefer the dithering.
Well then, can it be enabled/disabled? I'd like to try it in my proggy and see... the difference.

Relative Games - My apps
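For reference, core OpenGL exposes the dither stage as the GL_DITHER state, which is enabled by default; below is a minimal sketch of toggling it. Whether the driver actually changes its 16-bit output is implementation dependent, so treat this as an experiment rather than a guaranteed fix.

```c
/* Minimal sketch: toggling OpenGL's dither stage.
 * GL_DITHER is enabled by default; disabling it typically reveals
 * raw 16-bit banding instead of the dither pattern, though drivers
 * are free to ignore the state in some modes. */
#include <GL/gl.h>

void set_dithering(int enabled)
{
    if (enabled)
        glEnable(GL_DITHER);    /* default OpenGL state */
    else
        glDisable(GL_DITHER);   /* show un-dithered 16-bit output */
}
```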
Apart from changing the display resolution to use a higher colour depth than 16-bit, I would also advise choosing the highest quality image settings in the Windows display properties (in the "Performance & Quality" section of the nVidia drivers and something similar in ATI's drivers). I found that on some cards with "default" or "high performance" settings selected I got lots of dithering in my LOTR demo (VERY ugly), even in 32-bit mode. I think this has something to do with the driver compressing the textures to improve speed. I just wish that the "high quality" settings were always the default, so that programmers can at least assume that most people will see what we expect them to see.
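As an aside, here is a minimal sketch of the "higher colour depth than 16-bit" part, assuming a plain Win32/WGL setup; the `hdc` parameter and the lack of error handling are placeholders, not code from the demo being discussed.

```c
/* Sketch: asking Windows for a 32-bit colour buffer instead of 16-bit
 * when the OpenGL pixel format is chosen. ChoosePixelFormat returns the
 * closest match, so this is still only a request. */
#include <windows.h>

BOOL request_32bit_colour(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;   /* 32-bit colour instead of 16-bit */
    pfd.cDepthBits = 24;

    int format = ChoosePixelFormat(hdc, &pfd);
    return format ? SetPixelFormat(hdc, format, &pfd) : FALSE;
}
```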

Custard Slice: Have you tried specifying your textures with a specific internal format (i.e. GL_RGB8/GL_RGBA8)? I notice in your LOTR TTT entry you just use generic internal formats of '3' and '4' (equivalent to GL_RGB/GL_RGBA). Using more specific internal formats may result in better texture quality (although OpenGL is not forced to use the format you request).
Enigma
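For illustration, a minimal sketch of the two upload styles Enigma contrasts; `width`, `height` and `pixels` are assumed to come from the application's own image loader and are not taken from the entry being discussed.

```c
/* Sketch: generic vs. explicit internal formats for glTexImage2D. */
#include <GL/gl.h>

void upload_generic(int width, int height, const unsigned char *pixels)
{
    /* Generic internal format '4': the driver picks the storage precision,
     * which may be 16-bit or compressed under "performance" settings. */
    glTexImage2D(GL_TEXTURE_2D, 0, 4, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}

void upload_rgba8(int width, int height, const unsigned char *pixels)
{
    /* Explicit GL_RGBA8 asks for 8 bits per channel; OpenGL treats it as
     * a request rather than a requirement, but most drivers honour it. */
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}
```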
Enigma: I'll give that a try, although I remember that NeHe had some problems with the driver settings in his volumetric fog tutorial too. I'm totally fine with weird results if I'm doing something wrong (like using wrong parameters etc.) - I can learn from my mistakes and do it better next time, but I don't like the idea of the driver just "taking over" and messing with what the user sees when they run my program, especially if the default settings affect quality.

This topic is closed to new replies.