Ok, I refined my measurement and my interpolation code, but I'm still getting some bogus results.
This is the current Interpolation code:
template <unsigned TickPerSec>  // assuming the class is templated on the tick rate
float GameTimer<TickPerSec>::Interpolation()
{
    using namespace std::chrono;

    time_point<steady_clock> Now = steady_clock::now();
    nanoseconds InterpolationRange    = NextUpdateTime - PrevUpdateTime;  // one tick, 40 ms at 25 updates/s
    nanoseconds InterpolationPosition = Now - PrevUpdateTime;             // how far we are into that tick
    float Interpolation = static_cast<float>(InterpolationPosition.count())
                        / static_cast<float>(InterpolationRange.count());
    return Interpolation;
}
In the image below (1000 ms, 25 updates) all the numbers express milliseconds, and the colors have the following meanings:
Red: next game state update (at steps of 40 ms)
Orange: when we actually execute the game state update (this can vary depending on vsync or rendering time, I guess)
Blue: a sample of the interpolation value calculated at that specific point between the last update (orange) and the next update (red)
Green: the millisecond at which the blue interpolation value above it was recorded

Most values look fine, but some are suspicious, which tells me my math is wrong.
The suspicious ones (I refer to the green numbers) are for example the samples at 492 ms and 892 ms, and especially 609 ms, which is 3 ms past the last update yet has an interpolation value of 0.000... that can't be right.
475 ms looks extremely wrong as well: it's almost touching the red update in front of it and has a value of 0.84... nonsense =_=
If anyone spots the error in my math, please let me know (well, I can't totally exclude an error in my picture, but I still think it's the interpolation math).