I am working on a game in Unity (using PhysX) that has an input-based replay system, which requires determinism to function correctly. I'm aware of the challenges of getting floating-point arithmetic to be deterministic across different environments, so this was initially built just as an internal tool to help capture video footage. It turned out that the replays are entirely deterministic across different machines running the same build. Unfortunately, I get desyncs when a replay captured on one CPU vendor is played back on the other (i.e., AMD ↔ Intel). I've tried both Mono and IL2CPP (AOT compilation) with no success.
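For context, the way I spot a desync is by comparing a per-frame, bit-exact checksum of the physics state between the original run and the replay; the sketch below is roughly the idea (class and field names are simplified placeholders for this post, not my actual code).
using System;
using UnityEngine;

public class ReplayChecksum : MonoBehaviour
{
    // Rigidbodies whose state is included in the per-frame checksum.
    [SerializeField] private Rigidbody[] trackedBodies;

    private void FixedUpdate()
    {
        // Hash the raw bit patterns of each position component so that
        // even a 1-ulp divergence produces a different checksum.
        uint hash = 2166136261u; // FNV-1a offset basis
        foreach (Rigidbody body in trackedBodies)
        {
            Vector3 p = body.position;
            hash = Mix(hash, p.x);
            hash = Mix(hash, p.y);
            hash = Mix(hash, p.z);
        }
        Debug.Log($"frame {Time.frameCount}: {hash:X8}");
    }

    private static uint Mix(uint hash, float value)
    {
        foreach (byte b in BitConverter.GetBytes(value))
        {
            hash = (hash ^ b) * 16777619u; // FNV-1a prime
        }
        return hash;
    }
}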
There is of course a large tech stack between Unity and the machine code, but I'm wondering if anyone has any insight into this. I made a small console app to test arithmetic determinism:
using System;
using System.IO;

class FloatDeterminismTest
{
    static void Main()
    {
        float value = 0.2924150f;

        // Run the value through a few transcendental functions first.
        value = (float)Math.Sin(value);
        value = (float)Math.Cos(value);
        value = (float)Math.Tan(value);
        value = (float)Math.Pow(value, 2.32932f);

        // numbers.txt contains 200 randomly generated numbers from 0-1.
        using (StreamReader file = new StreamReader("numbers.txt"))
        {
            string line;
            int op = 0;
            while ((line = file.ReadLine()) != null)
            {
                float readValue = Convert.ToSingle(line);

                // Cycle through +, -, *, / so all four basic operations are exercised.
                if (op == 0)
                    value += readValue;
                else if (op == 1)
                    value -= readValue;
                else if (op == 2)
                    value *= readValue;
                else
                    value /= readValue;
                op = (op + 1) % 4;
            }
        }

        // Print the rounded decimal form and the exact byte pattern so the
        // result can be compared bit-for-bit across machines.
        Console.WriteLine(value.ToString("F10"));
        byte[] bytes = BitConverter.GetBytes(value);
        for (int i = 0; i < bytes.Length; i++)
        {
            Console.WriteLine(bytes[i]);
        }

        Console.Read();
    }
}
I got consistent results on both Intel and AMD, so could the desync be a configuration issue somewhere else in the stack? From what I've read, x86-64 should produce consistent results on modern Intel and AMD CPUs, but it's hard to find a straight answer.
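If it is configuration, I assume the first step is ruling out obvious environment differences between the two machines. This is the kind of comparison I have in mind (assuming RuntimeInformation is available under the scripting runtime in use):
using System;
using System.Runtime.InteropServices;

class EnvDump
{
    static void Main()
    {
        // Dump basic process/runtime details so the two machines can be
        // compared side by side for obvious differences.
        Console.WriteLine($"64-bit process:  {Environment.Is64BitProcess}");
        Console.WriteLine($"Architecture:    {RuntimeInformation.ProcessArchitecture}");
        Console.WriteLine($"Runtime:         {RuntimeInformation.FrameworkDescription}");
        Console.WriteLine($"OS:              {RuntimeInformation.OSDescription}");
        Console.WriteLine($"Processor count: {Environment.ProcessorCount}");
    }
}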
Thanks for any help.