
DXGI_FORMAT_D16_UNORM to RGB?

Started by February 27, 2018 01:12 AM
3 comments, last by galop1n 6 years, 11 months ago

Hi guys,

 

I want to visualize textures that use the DXGI_FORMAT_D16_UNORM format (depth buffers, shadow maps, etc.). I'm copying the image to the clipboard, but I can't figure out how to read the depth values:


	// staging resource has been mapped; subresource is the D3D11_MAPPED_SUBRESOURCE
	for (int i = 0; i < texture_height; i++) {
	    BYTE* src = (BYTE*)subresource.pData + (i * subresource.RowPitch); // start of row i
	    for (int j = 0; j < texture_width; j++) {
	        unsigned short grey = *(unsigned short*)src; // raw 16-bit value -- this isn't right?
	        // fill in my RGB values here
	        src += 2; // advance 2 bytes = 16 bits
	    }
	}
	

Please help. Thanks.

 

16_UNORM means the data is stored as unsigned normalized 16-bit integers, where 0 maps to 0.0 and 2^16-1 (65535) maps to 1.0. So what you have there looks fine for reading the raw integer data into your variable "grey"; you just need to divide by 65535.0f to convert it to a floating-point value.
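As a concrete sketch of that conversion (helper names here are illustrative, not from the thread), each 16-bit sample can be normalized to a float and then quantized to an 8-bit grey channel:

```cpp
#include <cstdint>

// Hypothetical helpers: convert a raw 16-bit UNORM depth sample to a float
// in [0, 1], then quantize to an 8-bit grey channel for an RGB image.
inline float UnormToFloat(uint16_t raw) {
    return raw / 65535.0f; // 0 -> 0.0, 65535 -> 1.0
}

inline uint8_t FloatToGrey8(float value) {
    return (uint8_t)(value * 255.0f + 0.5f); // round to nearest
}

// Inside the row loop from the question, each pixel becomes:
//   uint8_t grey = FloatToGrey8(UnormToFloat(*(const uint16_t*)src));
// with r = g = b = grey.
```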


Thanks MJP, I wasn't sure I was doing it right.

 

For my depth and shadow maps I store the linear view-space depth (normalized); I learned that from your posts. If I were using DXGI_FORMAT_R32_FLOAT, should I do it the same way, or just store the raw depth values?

 

Thanks again.

You probably won't see anything with how depth values are stored; with a perspective projection, the usable range of the depth buffer is probably compressed to somewhere between 0.98 and 1.0, or something similar.

 

Also, you don't have to go as far as copying and mapping on the CPU: you can create an R16_UNORM view and draw the depth buffer on screen with a pixel shader, which is way faster if you don't need to store the result for later use.
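For that approach, note that in D3D11 the depth texture must be created with DXGI_FORMAT_R16_TYPELESS so it can be bound both as a D16_UNORM depth-stencil view and as an R16_UNORM shader resource view. A minimal pixel shader sketch (names are illustrative) might look like:

```hlsl
// Visualize a D16 depth buffer bound through an R16_UNORM SRV.
Texture2D<float> DepthTex : register(t0);

float4 PSVisualizeDepth(float4 pos : SV_Position) : SV_Target
{
    // R16_UNORM reads back as a float in [0, 1].
    float d = DepthTex.Load(int3(pos.xy, 0));
    return float4(d, d, d, 1.0f);
}
```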

This topic is closed to new replies.
