
Calculate depth of front sphere from back side

Started by August 05, 2017 10:00 PM
5 comments, last by 0r0d 7 years, 6 months ago

I'm currently sitting on an issue which I can't solve due to my lack of mathematical knowledge.

Here is a picture I made which sums up what I'm trying to do:

[Image: depthissue.png]

 

To put it simply:

I render spheres (simple 3D meshes) into the scene on a separate FBO using front-face culling, so that the back side is rendered (the red part of the sphere in the screenshot).

Now, in the fragment shader I can access the depth value of the rendered pixel via "gl_FragCoord.z". What I want to do is calculate the depth value of the front-facing side of the sphere at the exact same pixel, so that I have a min and a max depth value and know where the sphere starts and ends at that pixel. I need those values for post-processing purposes.

 

My attempt to solve this was:

  1. pass the current vertex position into the fragment shader
  2. subtract the origin point from the vertex position (in view space) to retrieve an offset vector pointing from the origin to the back-face point
  3. mirror the z-component of this vector (as we are in view space)
  4. add the mirrored vector to the origin point, which gives us the front-facing (vertex) position of the sphere
  5. use this position to calculate the depth value like in an OpenGL depth buffer (haven't done this properly yet; see the sketch below)
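
For step 5, the mapping I would need to reproduce looks roughly like this (a minimal sketch of how OpenGL derives the stored depth value; "viewSpacePos" is a placeholder for the mirrored position):


//sketch: view-space position -> value the OpenGL depth buffer stores
//(assumes the default glDepthRange(0, 1))
vec4 clipPos = uProjectionMatrix * vec4(viewSpacePos, 1.0);
float ndcZ  = clipPos.z / clipPos.w; //perspective divide -> NDC z in [-1,1]
float depth = ndcZ * 0.5 + 0.5;      //-> [0,1], comparable to gl_FragCoord.z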

[Image: depthissue2.png]

 

I may or may not have an error in my shader code. (Maybe the way I multiply matrices is wrong?)

Here is my current code (a bit messy, but I tried to comment it):


//-------------- Vertex Shader --------------------
#version 330

layout (location = 0) in vec3 position;
layout (location = 1) in vec3 normal;
layout (location = 2) in vec4 color;
layout (location = 3) in vec2 uv;

uniform mat4 uProjectionMatrix;
uniform mat4 uModelViewMatrix;

out vec4 oColor;
out vec2 vTexcoord;

out vec4 vFragWorldPos;   //despite the name: fragment position in view space
out vec4 vOriginWorldPos; //despite the name: sphere origin in view space
out mat4 vProjectionMatrix;

void main()
{
    oColor = color;
    vTexcoord = uv;

    //coordinates are in view space!
    vFragWorldPos = uModelViewMatrix * vec4(position, 1.0);
    vOriginWorldPos = uModelViewMatrix * vec4(0.0, 0.0, 0.0, 1.0);

    gl_Position = uProjectionMatrix * uModelViewMatrix * vec4(position, 1.0);

    //send the projection matrix to the fragment shader
    vProjectionMatrix = uProjectionMatrix;
}

//---------------- Fragment Shader --------------
#version 330

in vec4 oColor;
in vec2 vTexcoord;

in vec4 vFragWorldPos;   //fragment position (view space, despite the name)
in vec4 vOriginWorldPos; //sphere origin (view space)
in mat4 vProjectionMatrix;

uniform sampler2D sGeometryDepth;

out vec4 outputF;

void main()
{
    //get texture coordinates into the screen-space depth buffer
    //(gl_FragCoord is a built-in and must not be redeclared;
    //the 1280x720 resolution is hard-coded here and should become a uniform)
    vec2 relativeTexCoord = gl_FragCoord.xy / vec2(1280.0, 720.0);

    //depth
    float backDepth = gl_FragCoord.z; //back depth
    float geometryDepth = texture(sGeometryDepth, relativeTexCoord).r; //geometry depth

    //--------------- Calculation of front depth ---------------

    //get the offset from the origin to the fragment (in view space)
    vec3 offsetNormal = (vFragWorldPos / vFragWorldPos.w).xyz - (vOriginWorldPos / vOriginWorldPos.w).xyz;

    //mirror the z-component of this offset (as we are in view space)
    offsetNormal.z *= -1.0;

    //add the mirrored offset to the origin point in order to get the mirrored coordinate point of the sphere
    vec4 sphereMirrorPos = vOriginWorldPos;
    sphereMirrorPos.xyz += offsetNormal.xyz;

    //apply perspective projection
    vec4 projectedMirrorPos = vProjectionMatrix * sphereMirrorPos;
    projectedMirrorPos /= projectedMirrorPos.w;

    //TODO: CALCULATE PROPERLY
    float frontDepth = projectedMirrorPos.z * 0.5 + 0.5; //no idea what to do here further

    //only color the pixels where the scene depth (provided by a screen-space texture which is
    //the depth buffer of a different FBO) lies exactly between the min/max depth values of the sphere
    if (backDepth > geometryDepth && frontDepth < geometryDepth) {
        outputF = vec4(1.0, 1.0, 1.0, 1.0);
    } else {
        discard;
    }
}

 

I suspect that transforming the coordinates into view space in the vertex shader (and working with those coordinates) may be an issue. (No idea where/when to divide by "w", for example.) Also, I'm currently stuck at the part where I have to calculate the depth values in the same range as the OpenGL depth buffer, in order to compare them in the if statement at the end of the fragment shader.

Hints/help would be greatly appreciated. 

I nearly got it working.

But there still seems to be a minor error in the calculation of the mirrored depth value.

Here is the current fragment shader (the vertex shader is the same as above):


#version 330

in vec4 oColor;
in vec2 vTexcoord;

in vec4 vFragWorldPos;   //fragment position (view space, despite the name)
in vec4 vOriginWorldPos; //sphere origin (view space)
in mat4 vProjectionMatrix;

uniform sampler2D sGeometryDepth;

out vec4 outputF;

//converts a [0,1] depth-buffer value into a linear [0,1] depth value
float linearizeDepth(float depthVal, float zNear, float zFar)
{
    float n = zNear; // camera z near
    float f = zFar;  // camera z far
    float z = depthVal;
    return (2.0 * n) / (f + n - z * (f - n));
}

void main()
{
    float uZnear = 0.01;
    float uZfar = 500.0;

    //get texture coordinates into the screen-space depth buffer
    //(the 1280x720 resolution is hard-coded and should become a uniform)
    vec2 relativeTexCoord = gl_FragCoord.xy / vec2(1280.0, 720.0);

    //depth of the backfacing sphere pixels and of the level geometry (depth texture of a different FBO)
    float backDepth = linearizeDepth(gl_FragCoord.z, uZnear, uZfar); //back depth
    float geometryDepth = linearizeDepth(texture(sGeometryDepth, relativeTexCoord).r, uZnear, uZfar); //geometry depth

    //Now we have to calculate the front depth

    //--------------- Calculation of front depth ---------------

    //get the z-distance from the origin to the fragment (in view space)
    float depthDiff = vFragWorldPos.z - vOriginWorldPos.z;

    //subtract the depth difference from the origin point in order to get the mirrored coordinate point of the sphere
    vec4 sphereMirrorPos = vec4(vFragWorldPos.xy, vOriginWorldPos.z - depthDiff, vOriginWorldPos.w);

    //apply perspective projection
    vec4 projectedMirrorPos = vProjectionMatrix * sphereMirrorPos;
    projectedMirrorPos /= projectedMirrorPos.w;

    //depth calculation: NDC z [-1,1] -> depth-buffer range [0,1], then linearize
    float frontDepth = (projectedMirrorPos.z + 1.0) / 2.0;
    frontDepth = linearizeDepth(frontDepth, uZnear, uZfar);

    //only color the pixels where the scene depth (depth buffer of a different FBO)
    //lies exactly between the min/max depth values of the sphere
    if (backDepth > geometryDepth && frontDepth < geometryDepth) {
        outputF = vec4(1.0, 1.0, 1.0, 1.0);
    } else {
        discard;
    }
}

 

I believe the issue is somewhere here:


float depthDiff = (vFragWorldPos.z - vOriginWorldPos.z);

//subtract the depth difference from the origin point in order to get the mirrored coordinate point of the sphere
vec4 sphereMirrorPos = vec4(vFragWorldPos.xy, vOriginWorldPos.z - depthDiff, vOriginWorldPos.w);

"vFragWorldPos" is the position of the current vertex in ModelviewSpace. "vOriginWorldPos" is the origin of the sphere in modelviewSpace.

I simply calculate the z difference of the two points by subtracting the z components of both vectors.

Then I reconstruct the mirrored vertex coordinate by using the xy coordinates of "vFragWorldPos", while the z-coordinate is calculated by subtracting the depth difference from the origin's z-coordinate.

 

The issue is that this doesn't seem to give correct results.

I tested this reconstruction method by changing this line:


vec4 sphereMirrorPos = vec4(vFragWorldPos.xy, vOriginWorldPos.z - depthDiff, vOriginWorldPos.w);

to this:


vec4 sphereMirrorPos = vec4(vFragWorldPos.xy, vOriginWorldPos.z + depthDiff, vOriginWorldPos.w);

which effectively calculates the depth of the back side of the sphere, which I then compared with the values of the depth buffer. They are exactly the same (which is correct). But subtracting the depthDiff value doesn't yield correct results.

 

Is there something I'm missing? Maybe the z coordinates of the vertices which were transformed into view space aren't linear?

 


This sounds like an XY problem to me. Try backing up a step or two and describe what you're trying to accomplish. I suspect someone on these forums will be able to suggest a different approach that might work better.

I only looked over the code quickly, so apologies if I'm misunderstanding something. But assuming your description of how you're trying to solve this is accurate, and that you're doing all the math in view space as you said, then it won't work, because negating the z value won't give you what you think it does. It won't give you a point backwards along the line of sight.

You can solve this by finding the point of intersection between the line of sight to the pixel and the line from the sphere origin that meets it at a right angle. Once you have this point (let's call it midPoint) you're basically home free, as you can just use the pixel position and midPoint to get the point you're looking for.

Here's some pseudo code:


vec3 pixelPos;  // position in view space of the pixel on the sphere back face. Known.
vec3 spherePos; // position in view space of the sphere origin. Known.

vec3 tempDir = cross(spherePos, pixelPos); // normal of the plane containing the camera, the pixel and the sphere origin
tempDir = cross(pixelPos, tempDir);        // vector pointing from the line of sight towards spherePos
tempDir = normalize(tempDir);

float d = dot(tempDir, spherePos); // distance along this vector from the line of sight to the sphere origin

vec3 midPoint = spherePos - tempDir * d; // point midway between front and back face along the line of sight

vec3 frontFacePos = 2.0 * midPoint - pixelPos; // the front-face view-space point you want

Of course this code doesn't check whether pixelPos and spherePos are parallel. You'll need to check for that and handle the situation accordingly. But I think this will give you what you want: a point backwards along the line of sight (in view space) from the back face of the sphere towards the front face.

On 7.08.2017 at 11:51 AM, 0r0d said:

I only looked over the code quickly, so apologies if I'm misunderstanding something. [...] You can solve this by finding the point of intersection between the line of sight to the pixel and the line from the sphere origin that meets it at a right angle. [...]

That is EXACTLY what I needed/what I was looking for. Thank you!

Although I'm having a hard time understanding why this formula works. (I seem to misunderstand how view space works.)

Quote

Of course this code doesn't check whether pixelPos and spherePos are parallel.

What exactly do you mean by "parallel"? That they are axis-aligned in view space?

6 hours ago, Lewa said:

That is EXACTLY what I needed/what I was looking for. Thank you!

Although I'm having a hard time understanding why this formula works. (I seem to misunderstand how view space works.)

What exactly do you mean by "parallel"? That they are axis-aligned in view space?

Here's a diagram that might help:

[Diagram: the "incorrect midPoint" obtained by negating z in view space, versus the correct midPoint found via tempDir on the line of sight]

I think the problem is that you're assuming view space means the Z values point along the line of sight. But view space is just camera space. So when you take the vector (pixelPos - spherePos) and then negate the Z component, you get the "incorrect midPoint" seen above.

So what you need is to find the correct midPoint by first finding the "tempDir" in the image: compute the vector normal to the plane, then do a second cross product to get the vector orthogonal to both the plane normal and the line of sight. Once you have that vector, you easily get the midPoint, and from it the frontFacePos.
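
For intuition: since the camera sits at the view-space origin, midPoint is simply the orthogonal projection of spherePos onto the line of sight through pixelPos. A minimal equivalent sketch:


//equivalent construction: project spherePos onto the line of sight
vec3 losDir   = normalize(pixelPos);             //direction of the line of sight
vec3 midPoint = losDir * dot(spherePos, losDir); //closest point on that line to the sphere origin
vec3 frontFacePos = 2.0 * midPoint - pixelPos;   //mirror the back-face point about midPoint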

Does that make sense?

As for why it matters to check whether spherePos and pixelPos are parallel: if they are parallel (i.e. they both lie on the line from the camera through spherePos), then the first two cross products will give you a zero-length vector, and the normalize operation will cause a divide by zero. So you need to check whether pixelPos is parallel to spherePos, in which case the frontFacePos would simply be


frontFacePos = 2.0 * spherePos - pixelPos;
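
Putting both cases together as one function, a minimal sketch (the epsilon threshold is an assumption; tune it to your scene's scale):


//sketch: front-face reconstruction including the parallel-case guard
vec3 frontFacePosition(vec3 pixelPos, vec3 spherePos)
{
    vec3 planeNormal = cross(spherePos, pixelPos);
    if (dot(planeNormal, planeNormal) < 1e-12) {
        //pixelPos lies on the line from the camera through spherePos
        return 2.0 * spherePos - pixelPos;
    }
    vec3 tempDir  = normalize(cross(pixelPos, planeNormal));
    float d       = dot(tempDir, spherePos);
    vec3 midPoint = spherePos - tempDir * d;
    return 2.0 * midPoint - pixelPos;
}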

 

This topic is closed to new replies.
