Ok, this might be tricky to explain... I'm trying to orient the data from a magnetometer. Essentially, the magnetometer reports a vector pointing toward magnetic north and down into the Earth at an angle of roughly 60 degrees from the horizontal. Kind of like this:
  \   y
   \  |
    \ |
-----x------- ->z
      \
       \
        v
So when you're holding the sensor, you can imagine that an axis reads 1.0 when it's in line with this vector (or -1.0 if it's facing away from it), and 0.0 when the axis is perpendicular to the 'flow' of the magnetic vector.
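To make that concrete, here's a little sketch of how I picture it (plain C++, not my real sensor code; the frame and the 60-degree dip are assumptions taken from the diagram above):

    #include <cmath>
    #include <cstdio>

    // How I picture the readings: each axis value is the dot product of
    // that axis direction with the unit field vector.
    struct Vec3 { float x, y, z; };

    static float Dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

    int main()
    {
        const float dip = 60.0f * 3.14159265f / 180.0f; // ~60 deg below horizontal

        // Field in the diagram's frame: horizontal part toward +z,
        // vertical part pointing down (-y).
        Vec3 field = { 0.0f, -std::sin(dip), std::cos(dip) };

        Vec3 xAxis = { 1, 0, 0 }, yAxis = { 0, 1, 0 }, zAxis = { 0, 0, 1 };

        // An axis in line with the field would read 1.0, an axis facing
        // away -1.0, and a perpendicular axis 0.0.
        std::printf("X=%.2f Y=%.2f Z=%.2f\n",
                    Dot(xAxis, field), Dot(yAxis, field), Dot(zAxis, field));
        return 0;
    }

That prints X=0.00 Y=-0.87 Z=0.50 for a sensor aligned with the diagram's axes.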
If I lay my sensor flat on the table with the y axis pointing directly at north, I get these values:
X = 0, Y = -90, Z = 44. This is fine; you can picture it in the diagram above (the tip 'v' is roughly this position).
If I now spin it 180 degrees on the table, I get these values:
X = 0, Y = -90, Z = -44. This is also fine; all that has happened is that, in the sensor's coordinate space, the vector has flipped around. All good so far.
So what I'd like to do is apply a rotation matrix to these values that brings the vector into the vertical position, so that when you spin the sensor around, the X and Z values change but the Y value stays at 1.0. This will help with the further calculations I need to do (basically, I need to find the heading regardless of pitch, roll, etc.).
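To put it concretely with the numbers above (normalized by hand, so take the figures as approximate): I want both (0, -0.90, 0.44) and (0, -0.90, -0.44) to come out of the transform as roughly (0, 1, 0), and likewise for any other orientation of the sensor.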
I thought I could achieve this by creating a pitch rotation matrix (by 60 degrees), taking the inverse of that, and transforming my readings by that matrix. That does work, but only in one orientation. If I spin the sensor around 180 degrees, it's almost as if the 'V' shape the two magnetic vectors form has been rotated, so that one of the vectors points up but the other is tilted further toward the horizontal.
What's the correct maths for this? It's been a long time since I did any rotation stuff. I'm using DirectX, with its RotationYawPitchRoll method to get my matrix out; a sketch of the attempt is below.
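For reference, here's roughly what that attempt looks like (a D3DX9-style sketch, not my real code; the CorrectReading name and the hardcoded 60 degrees are just for illustration):

    #include <d3dx9math.h>

    // Sketch of my current approach: build a 60-degree pitch rotation,
    // invert it, and push each (normalized) reading through the inverse.
    D3DXVECTOR3 CorrectReading(const D3DXVECTOR3& raw)
    {
        // Normalize so that a field-aligned axis reads 1.0.
        D3DXVECTOR3 m;
        D3DXVec3Normalize(&m, &raw);

        // Pitch-only rotation by the ~60-degree dip angle.
        D3DXMATRIX pitch, inv;
        D3DXMatrixRotationYawPitchRoll(&pitch, 0.0f, D3DXToRadian(60.0f), 0.0f);
        D3DXMatrixInverse(&inv, NULL, &pitch);

        // This is the step that only behaves in one orientation.
        D3DXVECTOR3 out;
        D3DXVec3TransformCoord(&out, &m, &inv);
        return out;
    }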