Unity: IRL to in game

Started by April 22, 2018 02:18 PM
2 comments, last by Golden Donkey Productions 6 years, 7 months ago

Hi, I've been working on this issue for a while and haven't yet found an answer.

Does anyone know the best way to convert latitude & longitude (lat/long) into a Vector3 position that I could use in a virtual world (if it's even possible)?

Thank you in advance :D

First, Unity doesn't use lat/long. You must be using some other library that provides those to you.

Second, how to do that is a science entirely unto itself, because the Earth isn't round, it's an ellipsoid, and lat/long has multiple competing definitions.

The easiest thing to do is to assume that you're working in the WGS-84 coordinate system and ellipsoid, and call some canned functions that end up mainly doing some division and basic trigonometry (sine/cosine). The Vector3 position you want is called a "Cartesian coordinate" or sometimes a "geocentric coordinate." (Lat/long is often called a "geodetic coordinate.")
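For reference, a minimal sketch of that geodetic-to-geocentric (ECEF) conversion, using the standard WGS-84 constants — written in Python rather than Unity C#, since the math is the same either way:

```python
import math

# WGS-84 ellipsoid constants
A = 6378137.0                # semi-major axis, meters
F = 1.0 / 298.257223563      # flattening
E2 = F * (2.0 - F)           # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, alt_m=0.0):
    """Convert WGS-84 geodetic (lat, lon, altitude) to geocentric ECEF (x, y, z) in meters."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    sin_lat = math.sin(lat)
    # prime vertical radius of curvature at this latitude
    n = A / math.sqrt(1.0 - E2 * sin_lat * sin_lat)
    x = (n + alt_m) * math.cos(lat) * math.cos(lon)
    y = (n + alt_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - E2) + alt_m) * sin_lat
    return x, y, z
```

Sanity check: (0°, 0°, 0 m) lands on the equator at x ≈ 6378137 m (the semi-major axis), and the poles come out at z ≈ ±6356752 m (the semi-minor axis).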

However, Unity only works in floats, and the resolution of a 32-bit float at the magnitude of the Earth's radius is worse than one meter. Thus, if you try to do rendering or simulation on geocentric values stored in 32-bit floats, you will get severe jitter. I hope your actual values are in double precision. To convert to values you can simulate and render with, you should subtract out some known center position (while still in double precision) before narrowing to float.
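A small sketch of that subtract-then-narrow idea, again in Python (NumPy's float32 standing in for Unity's float; the origin value here is an arbitrary made-up play-area location):

```python
import numpy as np

# Fixed local origin near the play area, kept in double precision (meters).
# The specific coordinates are arbitrary for this example.
ORIGIN = np.array([4510731.0, 4510731.0, 0.0], dtype=np.float64)

def to_local_float(ecef_xyz):
    """Subtract the origin while still in float64, then cast to float32."""
    local = np.asarray(ecef_xyz, dtype=np.float64) - ORIGIN
    return local.astype(np.float32)

# Two points 0.25 m apart along x, ~4.5e6 m from the ECEF origin.
p1 = [4510731.30, 4510731.0, 0.0]
p2 = [4510731.05, 4510731.0, 0.0]

# Naive: cast to float32 first. At this magnitude float32 steps are 0.5 m,
# so the 0.25 m separation is quantized away.
naive = np.float32(p1[0]) - np.float32(p2[0])

# Recentered: subtract in float64 first, then narrow. Sub-meter detail survives.
local = to_local_float(p1)[0] - to_local_float(p2)[0]
```

Here `naive` comes out as 0.5 (one float32 step), while `local` recovers the true 0.25 m separation, which is exactly the jitter problem described above.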

Here's a hit from Google: https://gist.github.com/govert/1b373696c9a27ff4c72a


Thanks a lot that's a great help.

This topic is closed to new replies.