Hi
I have a line segment intersection method that I translated from JavaScript to C#, but it is hit or miss at detecting whether two line segments actually intersect, and I can't work out why.
This is my method:
public static bool LineSegementsIntersect(Vector3 p0, Vector3 p1, Vector3 p2, Vector3 p3, out Vector3 intersection)
{
    intersection = Vector3.zero;

    // Line through p0-p1 in standard form A1*x + B1*z = C1 (working in the XZ plane)
    float A1 = p1.z - p0.z;
    float B1 = p0.x - p1.x;
    float C1 = A1 * p0.x + B1 * p0.z;

    // Line through p2-p3 in standard form A2*x + B2*z = C2
    float A2 = p3.z - p2.z;
    float B2 = p2.x - p3.x;
    float C2 = A2 * p2.x + B2 * p2.z;

    // Determinant of the 2x2 system; zero means the lines are parallel
    float denominator = A1 * B2 - A2 * B1;
    if (Mathf.Approximately(denominator, 0f))
    {
        return false;
    }

    // Solve the system (Cramer's rule) for the crossing point
    intersection = new Vector3((B2 * C1 - B1 * C2) / denominator, 0f, (A1 * C2 - A2 * C1) / denominator);
    return true;
}
And this is how I call it in my test code:
if (Maths.LineSegementsIntersect(start, end, A.Position, B.Position, out intersectingPoint))
{
    Debug.Log(start + ":" + end + " intersects " + A.Position + ":" + B.Position + " at: " + intersectingPoint);
    return true;
}
And the console reports an intersection for:
(0.0, 0.0, 0.0):(0.0, 0.0, 1.0) intersects (13.0, 0.0, 9.0):(22.0, 0.0, 9.0) at (0.0, 0.0, 9.0)
This is clearly incorrect, yet I get true and I have no idea why.
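To rule out anything Unity-specific, here is the same math reproduced as a quick standalone check with plain floats and those test values (no Unity types, just the XZ coordinates); it prints the same point, so the formula itself really does produce (0, 0, 9) for these inputs:

using System;

class IntersectCheck
{
    static void Main()
    {
        // Segment 1: (0,0) -> (0,1), segment 2: (13,9) -> (22,9), in XZ coordinates
        float p0x = 0f, p0z = 0f, p1x = 0f, p1z = 1f;
        float p2x = 13f, p2z = 9f, p3x = 22f, p3z = 9f;

        float A1 = p1z - p0z;                   // 1
        float B1 = p0x - p1x;                   // 0
        float C1 = A1 * p0x + B1 * p0z;         // 0
        float A2 = p3z - p2z;                   // 0
        float B2 = p2x - p3x;                   // -9
        float C2 = A2 * p2x + B2 * p2z;         // -81
        float denominator = A1 * B2 - A2 * B1;  // -9

        float x = (B2 * C1 - B1 * C2) / denominator;  // 0
        float z = (A1 * C2 - A2 * C1) / denominator;  // 9

        Console.WriteLine($"denominator = {denominator}, intersection = ({x}, {z})");
    }
}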