
Help with loading a couple of DDS textures as 1 large texture in memory

Started by yonisi, December 24, 2017 08:52 PM
6 comments, last by yonisi 7 years, 1 month ago

Hi everyone,

I have a terrain grid which is pretty large and I'd like to have some data stored in textures to map stuff for me. For example I'm going to have water body data made of 64 DDS textures, where each texture is 4096x4096. Now, I don't want to hold all 64 textures in memory all the time - basically I could, but VRAM will probably be crowded with other stuff and I want to save room. Also there are some other data sets that I'll need to hold which may be even larger.

So, I decided that at a given moment I'd like to hold a 3x3 block of those 64 textures in VRAM, use the camera position to decide which block I'm in right now, and render those 3x3 textures accordingly. Now, I have 2 ways that I thought of doing what I need:

1. Use a texture array - I already know how to use this and it would be pretty easy to manage, I think. But my fear is that I won't have a way to decide which array index fits each pixel WITHOUT USING a couple of if/else pairs for dynamic branching in the pixel shader, which AFAIK isn't such a good idea. If you think that using a couple of dynamic branches isn't THAT bad, then I may do just that; it would be easier for me.

2. Use a texture atlas in memory - This solution has the advantage that I can directly translate world position to texture coordinates in the pixel shader and sample, but I'm not sure how to load 3x3 DDS textures into 1 big atlas that is 3x3 times the size of each texture. I'm especially confused about how to order the textures correctly in the atlas, as I'm not sure it'll be ordered the same way as loading into an array.
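For what it's worth, the world-position-to-atlas-UV translation in option #2 is just offset-and-scale arithmetic. Here is a minimal C++ sketch, assuming 1 texel per world unit and that the engine tracks the world-space origin of the currently loaded 3x3 block (both are assumptions, and the names are made up):

```cpp
#include <cassert>

// World position -> UV into a 3x3 atlas of 4096x4096 tiles.
// windowOriginX/Y: world-space corner of the loaded 3x3 block (assumed tracked).
struct Uv { float u, v; };

Uv atlasUv(float worldX, float worldY, float windowOriginX, float windowOriginY)
{
    const float tileSize  = 4096.0f;         // texels per tile, 1 texel = 1 world unit (assumed)
    const float atlasSize = 3.0f * tileSize; // 12288 texels per atlas side

    // Offset into the 3x3 window, then normalize to [0,1] across the whole atlas.
    Uv uv;
    uv.u = (worldX - windowOriginX) / atlasSize;
    uv.v = (worldY - windowOriginY) / atlasSize;
    return uv;
}
```

Ordering then stops being a mystery: as long as each DDS tile is copied into the atlas at pixel offset (column * 4096, row * 4096) (e.g. with CopySubresourceRegion in D3D11), the UV above lands in the right tile automatically.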

If option #2 is doable, I think going with that would be easier than translating world position to array indices.
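The "which 3x3 block am I in" part can also be sketched in a few lines either way. This assumes an 8x8 grid of tiles (64 total, as above), 1 texel per world unit, and clamping at the grid edges; all names are hypothetical:

```cpp
#include <algorithm>
#include <cassert>

// Pick the top-left tile of the 3x3 window to keep resident, from the
// camera position. Grid is 8x8 tiles of 4096 texels each (assumptions).
struct Window { int firstX, firstY; };

Window windowForCamera(float camX, float camY)
{
    const int   gridTiles = 8;
    const float tileSize  = 4096.0f;

    int tileX = static_cast<int>(camX / tileSize);
    int tileY = static_cast<int>(camY / tileSize);

    // Center the window on the camera's tile, clamped so it stays on the grid.
    Window w;
    w.firstX = std::clamp(tileX - 1, 0, gridTiles - 3);
    w.firstY = std::clamp(tileY - 1, 0, gridTiles - 3);
    return w;
}
```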

Thanx for any help.

4 hours ago, yonisi said:

DDS textures where each texture is 4096x4096. Now, since I don't want to hold all 64

Because you are using 4K textures you can no longer use an atlas. 4K is the max texture size that is guaranteed to work on low, mid and high range desktops.

An atlas isn't anything special: 4 1024*1024 images are set alongside each other inside a 2048*2048 image, like a sprite sheet for textures.

So pixels (0,0) to (1024,1024) would be one texture in the atlas, and (1024,0) to (2048,1024) another. You just put the images side by side.

 

This is also why you can't use an atlas. Your atlas would have to be 12288*12288 to store a 3x3 block of 4K textures. Meaning only monster computers could load it safely - no use on consoles, mobiles and most desktops. You lose 7/8 of your players.

Even AAA games don't use 4K textures for terrain, instead they use tiled textures and blending to get the final look.

Try the arrays, and if that doesn't work you should try texture tiling.


A few branches to compute a texture array index really doesn't sound like a big deal to me. If you're dealing with atlasing of textures that are all the same size, then texture arrays are definitely the easiest way to do it. This is especially true when it comes to mipmaps (which you'll want for terrain textures), since texture arrays keep mips separate and therefore let you avoid the "bleeding" problems that you run into with traditional atlases. There are some caveats when it comes to dynamically updating an atlas at runtime from the CPU, which I can elaborate on if you can tell me which API you're planning on using for this.
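To add to that: for this particular case the slice index doesn't even need an if/else chain, because with equally sized tiles it falls out of integer arithmetic. A hedged C++ sketch of the idea (the window-origin bookkeeping and names are assumptions):

```cpp
#include <cassert>

// Which slice of a 9-slice Texture2DArray covers this world position?
// Slices are assumed loaded row-major: slice = row * 3 + column.
int arraySlice(float worldX, float worldY, float windowOriginX, float windowOriginY)
{
    const float tileSize = 4096.0f; // 1 texel = 1 world unit (assumed)

    int localX = static_cast<int>((worldX - windowOriginX) / tileSize); // column 0..2
    int localY = static_cast<int>((worldY - windowOriginY) / tileSize); // row 0..2
    return localY * 3 + localX;
}
```

The same arithmetic translates directly to HLSL with floor() or int casts, so the pixel shader needs no branching at all for the index selection.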

If you're curious, or you'd like to expand your atlas approach into something more generalized, you may want to look for some articles or presentations about virtual texturing. Virtual texturing is really a generalization of what you're proposing, and it has been used effectively for terrain in games with large worlds (like the Battlefield series, or the Far Cry series). The typical approach they use for "figure out where to sample a pixel's texture from" is to have an indirection texture that's sampled first. So for instance, you might have a "virtual texture" that's 32k x 32k texels representing all textures that could ever be referenced, but you only keep an 8k x 8k atlas of textures loaded. You would first sample the indirection texture to see where the virtual texture page is loaded into the atlas, and that would give you UV coordinates to use when sampling the atlas. So if your "page" size is 32x32, then your indirection texture would only need to be 1k x 1k.

In practice it gets pretty complicated with mip mapping, since each mip will typically be packed separately in the atlas, which requires manual mip sampling + filtering in the pixel shader. There's also somewhat-recent hardware + API support for virtual textures, called "Tiled Resources" in D3D and "Sparse Textures" in GL/Vulkan. If you use that you can potentially skip the indirection texture and also remove the need for manual mip/anisotropic filtering in the pixel shader, but your virtual texture still has to respect the API limits (16k max in D3D).
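A toy CPU-side version of that indirection step might look like this (the page-table layout is made up for illustration; a real implementation would do this lookup in the pixel shader):

```cpp
#include <cassert>

// Resolve a virtual UV to a physical-atlas UV via an indirection table.
// pageX/pageY give, per indirection entry, the tile slot in the physical
// atlas where that virtual page currently lives (layout is an assumption).
struct PhysUv { float u, v; };

PhysUv resolveVirtualUv(const int* pageX, const int* pageY, int indirectionSize,
                        int physTilesPerSide, float virtU, float virtV)
{
    // 1) Find the indirection entry covering this virtual UV.
    int px = static_cast<int>(virtU * indirectionSize);
    int py = static_cast<int>(virtV * indirectionSize);
    int entry = py * indirectionSize + px;

    // 2) Base UV of the physical page holding this virtual page...
    PhysUv uv;
    uv.u = pageX[entry] / static_cast<float>(physTilesPerSide);
    uv.v = pageY[entry] / static_cast<float>(physTilesPerSide);

    // 3) ...plus the fractional position inside that page.
    uv.u += (virtU * indirectionSize - px) / physTilesPerSide;
    uv.v += (virtV * indirectionSize - py) / physTilesPerSide;
    return uv;
}
```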


 

1 hour ago, Scouting Ninja said:

Because you are using 4K textures you can no longer use an atlas. 4K is the max texture size that is guaranteed to work on low, mid and high range desktops. [...]

D3D10-level hardware guarantees support for 8k textures, and D3D11-level hardware guarantees support for 16k textures. 
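Those guarantees as a quick reference. The values match the required caps D3D10_REQ_TEXTURE2D_U_OR_V_DIMENSION (8192) and D3D11_REQ_TEXTURE2D_U_OR_V_DIMENSION (16384); the helper function itself is just a sketch:

```cpp
#include <cassert>

// Guaranteed-supported max Texture2D dimension per D3D feature level major.
int maxGuaranteedTextureSize(int featureLevelMajor)
{
    if (featureLevelMajor >= 11) return 16384; // D3D11-level hardware
    if (featureLevelMajor >= 10) return 8192;  // D3D10-level hardware
    return 4096;                               // feature level 9_3
}
```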

2 hours ago, MJP said:

D3D10-level hardware guarantees support for 8k textures

I feel that I should clear up something.

Most DDS users store their mips right in the texture, laid out beside it. So a 4K*4K turns into an 8K*4K, and likewise an 8K*8K goes to 16K*8K.

This is why I assumed that atlasing would require D3D11-level hardware, meaning that only 1 in 8 players would have a computer that can play it.

Of course, if the mipmaps are kept separate, then I am wrong.
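For reference, the arithmetic for a full mip chain: each level has a quarter of the texels of the previous one, so the whole chain converges to about 4/3 of the base level rather than 2x. A quick sketch:

```cpp
#include <cassert>

// Total texels in a full mip chain for a square base texture.
long long mipChainTexels(int baseSize)
{
    long long total = 0;
    for (int s = baseSize; s >= 1; s /= 2)
        total += static_cast<long long>(s) * s;
    return total;
}
// mipChainTexels(4096) == 22369621, about 1.33x the 16777216 texels of the base level
```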

 

Still, there was this huge race for 4K textures, and now most AAA games are actually using smaller and smaller textures, with only characters using the large ones.

4 hours ago, Scouting Ninja said:

Because you are using 4K textures you can no longer use an atlas. 4K is the max texture size that is guaranteed to work on low, mid and high range desktops. [...]

Edit: Already mentioned - I was a bit late and MJP's post hadn't shown up yet.

Ummm... no. Doom (2016) uses a 16k x 16k megatexture (made up of tiles smaller than 4k x 4k, but that doesn't really matter), and they do this even on consoles. Even my mid-range test machines support this resolution. Doom also sold well, so it seems to be okay. You might lose some low-end PCs, but honestly they're a support nightmare to deal with anyway.

"Those who would give up essential liberty to purchase a little temporary safety deserve neither liberty nor safety." --Benjamin Franklin

Update: OpenGL 4.1 mandates support for 16384x16384 textures, and I cannot see Vulkan requiring less than that, but do not quote me on that :) OpenGL 4.1 is fairly old at this point (4.6 is current), and 4.1 is even supported on macOS.



Thanx everyone for the answers!

MJP - I'm going to go with your advice then and use texture arrays. I learned DX11 coding from Frank Luna's book and he always warned about dynamic branching in shaders, but since I've already seen some in code that has been working for years without much in the way of performance issues, I guess it won't be such a big deal.

FWIW - I'm working on this engine for DX11 and PC only (as eventually this engine should be integrated into an existing game engine going through an upgrade). The specific textures I'm referring to here actually won't use any mipmapping, because they hold mapping data, not art (e.g. watermaps will hold 8-bit color values that will be used as depth maps plus some other data related to water). I also plan to use the same mechanics for my blendmaps. Currently I have 8192x8192 textures that hold texture IDs and alpha values, which are used in the pixel shader to select which texture IDs to blend and with which alpha for each; the result is eventually a multitexture operation.
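As a side note, the blendmap scheme described above can be sketched roughly like this: each texel packs a texture ID and a blend weight into separate 8-bit channels (the exact packing here is my assumption, not necessarily the engine's):

```cpp
#include <cassert>
#include <cstdint>

// Decode one blendmap texel: one channel selects a terrain texture,
// another carries its blend weight (packing is assumed for illustration).
struct Blend { int textureId; float alpha; };

Blend decodeBlendTexel(uint8_t idByte, uint8_t alphaByte)
{
    Blend b;
    b.textureId = idByte;             // 0..255 selects which texture to sample
    b.alpha     = alphaByte / 255.0f; // normalized blend weight
    return b;
}
```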

Regarding texture sizes, limits and performance: I'm ALREADY using a couple of blend maps in the current version that are all 8192x8192, and I haven't noticed any performance issues, not even on my previous laptop with a mobile GTX 750 card. This engine isn't for any kind of console or mobile device, it's for PC only, and most users of the current version of the game already have decent hardware, so I don't think there will be any issues with using 4096x4096 textures.

This topic is closed to new replies.
