Things are pretty straightforward if you only target a single tier (which, let's be honest, is my case), but I've still been pondering how to go about basic scalability.
Assumption: all textures are encoded as BC 1/3/5/6/7 (shipped as DDS), or in the future as ASTC once that reaches the mainstream. The target is the PC market.
Building BC-compressed textures is trivial for 1/3/5, but becomes a strictly offline process for 6 and 7. Moreover, while cooking textures for a single API (D3D or OpenGL/Vulkan in this case) is a fixed process, switching between the two requires swizzling blocks in the encoded texture. Again, this is fairly trivial for 1/3/5, but I'm not aware of any publicly available implementation for 6 and 7. In practice this means the texture needs to be decoded and re-encoded for whichever API it wasn't cooked for. I'm assuming (sic!) this is also true for ASTC.
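For reference, here's roughly what the trivial 1/3/5 case looks like. A minimal C sketch, assuming the swizzle in question is the usual vertical flip between D3D's top-left and OpenGL's bottom-left origin (a full-surface flip also reverses the order of the block rows); the `BC1Block` layout matches the format, but the names are mine, not from any particular library:

```c
#include <stdint.h>

/* A BC1 block stores two RGB565 endpoints followed by four bytes of
 * 2-bit palette indices, one byte per 4-texel row. Flipping the block
 * vertically is therefore just a matter of reversing the row bytes.
 * BC3/BC5 add 3-bit index blocks that straddle byte boundaries, so
 * they need extra bit shuffling, but the idea is the same. BC 6/7
 * blocks are mode- and partition-dependent, and the partition shapes
 * aren't vertically symmetric -- which is exactly the problem. */
typedef struct {
    uint16_t color0;   /* RGB565 endpoint 0 */
    uint16_t color1;   /* RGB565 endpoint 1 */
    uint8_t  rows[4];  /* 2-bit indices, one byte per row, top first */
} BC1Block;

static void bc1_flip_block(BC1Block *b)
{
    uint8_t t;
    t = b->rows[0]; b->rows[0] = b->rows[3]; b->rows[3] = t;
    t = b->rows[1]; b->rows[1] = b->rows[2]; b->rows[2] = t;
}
```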
The same problem applies to resolution: scaling BC 1/3/5 textures down on the user's machine probably entails a fairly short preprocessing step during installation or first run, but re-encoding a couple of hundred or more BC 6/7 textures will probably end with the game getting uninstalled before it is ever run (back of the envelope: at, say, 1 Mtexel/s for a high-quality BC7 encoder, a single 4096x4096 texture takes ~16 seconds, so a few hundred of them approaches an hour).
So here are the options I can think of:
- target only one API and don't care about supporting both
- target both APIs and ship a full set of high-quality textures for each (or, you know, figure out how to swizzle BC 6/7 blocks); BC 1/3/5 blocks can simply be reordered for the non-preferred API
- ship textures in 2-3 sizes (e.g. 4k, 2k, 1k), turning a blind eye to space requirements (see the sketch after this list)
- don't use texture compression for high-quality textures
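For the multiple-sizes option, the selection step is at least easy. Below is a hypothetical first-run sketch in C that picks a texture tier from dedicated VRAM; the `tier_for_vram` name and the thresholds are made up for illustration, not taken from anywhere:

```c
#include <stddef.h>

/* Hypothetical tiers matching the shipped texture sets. */
typedef enum { TIER_1K, TIER_2K, TIER_4K } TextureTier;

/* Pick a tier from dedicated VRAM in MiB. The thresholds are
 * placeholders; a real implementation would be tuned per title
 * (and would let the user override the choice in the settings). */
static TextureTier tier_for_vram(size_t vram_mib)
{
    if (vram_mib >= 4096) return TIER_4K;
    if (vram_mib >= 2048) return TIER_2K;
    return TIER_1K;
}
```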
Any thoughts on a semi-automatic "best case" implementation?