
[History] When did 24-bit color become a thing in games?

Started by August 26, 2020 01:52 PM
24 comments, last by Borchy 4 years, 3 months ago

I can't speak to the technical capabilities of the cards, but from a business standpoint you would want to manufacture one physical card that could be configured for many different price points. An expensive card build could be “laser-burned” down to a card at a cheaper price point for the more cost-conscious consumer. Both cards and their components stay identical to keep manufacturing costs as low as possible, but their functional configurations are physically different. This could be a reason for what you are observing.

21st Century Moose said:
Vertex and fragment shaders didn't exist back then, not on consumer hardware. The per-vertex stages of the pipeline were done in software on the CPU

Didn't hardware fixed-function vertex processing (per-vertex matrix transform and lighting?) happen some time before programmable shaders, as an intermediate step? I forget what it was marketed as, but I recall seeing it some time before I had to worry about the supported "shader model".

Thinking about it, it's interesting that once we had 24-bit colour (or 32-bit with alpha for some things within games) it was a long time before we really progressed beyond that. I guess HL2: Lost Coast was the first time I saw HDR rendering (16-bit per channel floating point?), and it's only in the last year or so that HDR displays going past 8 bits per channel have become a big thing.

I think I recall that a lot of displays actually used 6 bits per channel (18-bit colour), with dithering applied to the 24-bit output from the GPU/games?
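To make that concrete: a panel like that has to squeeze each 8-bit channel into 6 bits, and ordered dithering trades the lost 2 bits for spatial noise so the average over a small block of pixels still approximates the original level. Here's a minimal sketch of per-channel ordered (Bayer) dithering; the 4x4 matrix and the function name are just illustrative, and real panels do this in hardware, often temporally as well.

#include <cstdint>

// Classic 4x4 Bayer threshold matrix, values 0..15.
static const int kBayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

// Quantise one 8-bit channel value to 6 bits using ordered dithering.
// The 2 bits we discard encode a fractional part of 0..3; biasing by
// the per-pixel threshold decides whether that pixel rounds up or down.
uint8_t DitherTo6Bit(uint8_t value, int x, int y)
{
    int bias    = kBayer4[y & 3][x & 3] >> 2;  // 0..3, varies per pixel
    int rounded = value + bias;
    if (rounded > 255) rounded = 255;          // avoid overflow at white
    return static_cast<uint8_t>(rounded >> 2); // keep the top 6 bits
}

For example, an 8-bit input of 2 rounds up to the next 6-bit level on half the pixels in each 4x4 block and down on the other half, so the block averages out to roughly the original brightness even though no single pixel can display it.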


SyncViews said:

21st Century Moose said:
Vertex and fragment shaders didn't exist back then, not on consumer hardware. The per-vertex stages of the pipeline were done in software on the CPU

Didn't hardware fixed-function vertex processing (per-vertex matrix transform and lighting?) happen some time before programmable shaders, as an intermediate step? I forget what it was marketed as, but I recall seeing it some time before I had to worry about the supported "shader model".

It did indeed. The first GeForce was the first consumer hardware with hardware transform and lighting (i.e. vertex processing) and it pre-dated vertex shaders.

It was marketed as a “GPU” and is where the term “GPU” comes from.
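To make "fixed-function vertex processing" concrete: in legacy OpenGL you didn't write a vertex shader at all; you loaded matrices and light parameters into pipeline state, and the T&L stage (in software on the CPU on older cards, in hardware from the GeForce on) transformed and lit every vertex for you. A minimal setup sketch, not tied to any particular card:

// Legacy (compatibility-profile) OpenGL fixed-function T&L state.
// There is no vertex shader here: the driver/hardware transforms and
// lights each vertex using exactly this state.
#include <GL/gl.h>

void SetupFixedFunctionTnL()
{
    // Per-vertex transform: the fixed pipeline multiplies every vertex
    // by the PROJECTION and MODELVIEW matrices loaded here.
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glFrustum(-1.0, 1.0, -1.0, 1.0, 1.0, 100.0);

    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    glTranslatef(0.0f, 0.0f, -5.0f);

    // Per-vertex lighting: one directional light, evaluated by the
    // fixed T&L stage rather than by any shader code.
    const GLfloat lightDir[4]     = { 0.0f, 0.0f, 1.0f, 0.0f }; // w=0: directional
    const GLfloat lightDiffuse[4] = { 1.0f, 1.0f, 1.0f, 1.0f };
    glEnable(GL_LIGHTING);
    glEnable(GL_LIGHT0);
    glLightfv(GL_LIGHT0, GL_POSITION, lightDir);
    glLightfv(GL_LIGHT0, GL_DIFFUSE, lightDiffuse);
}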

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

21st Century Moose said:
It was marketed as a “GPU” and is where the term “GPU” comes from.

Actually it came from the first PlayStation back in '94.

Borchy said:

21st Century Moose said:
It was marketed as a “GPU” and is where the term “GPU” comes from.

Actually it came from the first PlayStation back in '94.

I never knew that, but a quick check confirms it - interesting indeed!

Direct3D has need of instancing, but we do not. We have plenty of glVertexAttrib calls.

Thank you all for your input, it's been most educational!

Is there any book I can read about all of this - about the history of the tech behind early GPUs? Specifications are a good source, but all I've been able to find are the Voodoo specifications.

