Thaumaturge said: In all fairness, my snippet above doesn't use a velocity-based approach, so there should be no overshooting the target--at least as far as I see.
That's why I said "if there were a physics simulation". Of course, we expect to see physically plausible behavior even if no simulation backs it, so keeping track of velocity might help if you can't solve it otherwise.
I do hope that it doesn't come to such a thing, however. ^^;
I could share the controller code--there's not much of it. But it's been years since I made and used it, so I'd probably fail to explain how it works, and I'd have to see which is the most recent version I did. IIRC, the input is the current position and velocity plus the target position, and the output is a new velocity and the time until we hit the target. The bound is a given constant acceleration, which defines how quickly the camera can change its movement. The resulting motion is very natural. But there is no bound on maximum velocity in this controller. Simply clipping the output should work, but that's some work for you without knowing whether the result will be what you want.
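For illustration, one common way to realize such an acceleration-bounded follow (a sketch of the idea only, not the original controller; all names and constants here are made up) is to steer the velocity toward the braking curve v = sqrt(2·a·|error|), so the camera always moves as fast as it can while still being able to stop at the target:

```python
import math

def accel_bounded_follow(pos, vel, target, dt, max_accel=5.0):
    # Fastest speed from which we can still brake to a stop within the
    # remaining distance, given the acceleration bound.
    err = target - pos
    desired_vel = math.copysign(math.sqrt(2.0 * max_accel * abs(err)), err)
    # Steer the current velocity toward that profile, changing it by at
    # most max_accel per second -- the "constant acceleration" bound.
    dv = desired_vel - vel
    dv = max(-max_accel * dt, min(max_accel * dt, dv))
    vel += dv
    pos += vel * dt
    return pos, vel
```

Clipping desired_vel to some maximum speed would add the missing velocity bound mentioned above.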
JoeJ said: That's why I said "if there were a physics simulation".
Ah, fair--my apologies, then!
JoeJ said: Of course, we expect to see physically plausible behavior even if no simulation backs it, so keeping track of velocity might help if you can't solve it otherwise.
Alas, I did try a velocity-based approach, to similar effect: When the camera was overly loose, all was pleasantly smooth; when the camera was kept as close to the target as intended, jitter appeared.
(As I recall, I added the current offset between target and camera, multiplied by delta-time and a scalar, to the current velocity. I further applied a friction force (without which the camera oscillated endlessly, of course). This velocity was then applied to the camera's current position, multiplied by delta-time.)
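In other words, something along these lines (a reconstruction from the description above, with made-up constants; not the actual project code):

```python
def follow_step(cam_pos, cam_vel, target_pos, dt, stiffness=20.0, friction=8.0):
    # Pull the camera toward the target, proportionally to the offset...
    cam_vel += (target_pos - cam_pos) * stiffness * dt
    # ...then bleed off velocity so the camera doesn't oscillate endlessly.
    cam_vel *= max(0.0, 1.0 - friction * dt)
    cam_pos += cam_vel * dt
    return cam_pos, cam_vel
```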
JoeJ said: I could share the controller code--there's not much of it. But it's been years since I made and used it, so I'd probably fail to explain how it works, and I'd have to see which is the most recent version I did. IIRC …
It's tempting, I will confess.
But I haven't yet tried Gnollrunner's suggestion (hopefully tomorrow!)--we'll see if that doesn't perhaps do the job!
I suppose that at this point one more question might be: is there something that I might be doing somewhere in my code that might be resulting in otherwise-sound approaches failing? Some oversight in how I handle… target-velocity, or camera-placement, or… something?
Just to check that there isn't some problem outside of the current approach to camera-following that might be undermining all attempts.
I would like to see the dt values, the target positions, and the camera positions in a log, with a mark where the jitter starts. However, the issue may be with your Newtonian motion integration; instead, you may want to use Runge-Kutta 4th-order (RK4) integration.
We can only speculate based on what you tell us, so we can't confirm anything.
What puzzles me most is how the camera follows the object. The code seems to do a real-time follow, with no buffer of past positions being followed; to me, the camera update should be detached--for example, run in a thread that follows changes of the object's position. I can see a misunderstanding here. For now it seems the camera is strictly attached to the object's position, so maybe the object jitters, not the camera. However… there are too few details to answer anything.
_WeirdCat_ said: I would like to see dt and position values and camera player positions in a log with a mark where jitter starts …
JoeJ said: A video would help as well if possible.
Sure--if experiments based on one or more suggestions given earlier in the thread don't work out, I'll try to remember to post those things. (It's rather late here, and I'm not working on the project today.) I will note that dt-values have been fairly stable when I've printed them out, as I recall.
_WeirdCat_ said: … however, the issue may be with your Newtonian motion integration; instead, you may want to use Runge-Kutta 4th-order (RK4) integration
I'll confess that I don't think I've ever heard of Runge-Kutta integration. I may look that up, thank you!
_WeirdCat_ said: We can only speculate based on what you tell us, so we can't confirm anything.
Of course! I suppose that I was just hoping that there was some obvious flaw in my design, or some well-known pitfall that might be causing trouble.
_WeirdCat_ said: What puzzles me most is how the camera follows the object. The code seems to do a real-time follow, with no buffer of past positions being followed; to me, the camera update should be detached--for example, run in a thread that follows changes of the object's position. I can see a misunderstanding here. For now it seems the camera is strictly attached to the object's position, so maybe the object jitters, not the camera. However….
I'm… not sure that I follow what you're saying here. (In all fairness, it is late here, and I'm rather tired.)
The camera isn't directly attached to the target, at least in the sense of being connected below it in the scene-graph. The target and the camera are two separate scene-graph nodes.
That said, I have checked the positions of both the target and the diegetic UI, and they've seemed as expected, so I doubt that the jitter comes from them.
Which brings to mind: one thing that I have to hand right now is a graph that I generated.
It shows the y-axis positions (the numbers on the left) of both the target and the camera. (I tried to hew close to the y-axis for the experiment that generated its data.)
If I'm not much mistaken, this data was taken under constant motion, with effectively little or no acceleration.
Alas, I forgot to check before closing the spreadsheet which line was which, but I think that the red line is the camera, and the blue the target.
I'm guessing that the jitter is those points at which the red line approaches the blue, then falls away.
This is definitely causing the problem: first you move the object, then the camera, which sometimes causes the camera to exactly match the object's position.
When the object stops moving, the camera does not; it reaches the object's position, and then--my bet--it jitters due to floating-point imprecision.
So instead of reaching the object's position exactly, you get a difference somewhere between -0.5e-7 and 0.5e-7.
You need to damp the movement of the camera when it's too close: if (vectorLength(diff) <= threshold) scalar = 0.0; → this is a crude damp, but it may work; I would rather damp on diff*dt.
Where the damp function is: if diff*dt <= threshold, then do not update the camera position. This won't let the camera exactly reach the object's position.
But you still move by the distance the object covers during dt. You might instead use the normalized diff vector, which gives you the direction the camera needs to go; multiply that by the object's last-frame velocity times dt, and add that to the camera position, checking whether the camera is too close--if it is, stop the movement. This may require intensity damping, meaning the closer the camera gets, the bigger the damping becomes.
However, with this approach you update the velocity vector by adding a force (acceleration): normalized(diff) * dt.
In summary, your camera needs a velocity vector and a position vector, where the acceleration is your diff vector:
vel = vel + damp(accel)*dt;
pos = pos + vel*dt;
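As a sketch, the whole scheme might look like this in Python (names and constants are made up, and a drag term has been added, which the summary above omits but which is needed for the motion to settle rather than oscillate):

```python
def damp(accel, dt, threshold=1e-3):
    # Zero out sub-threshold corrections so the camera stops chasing
    # differences too small to matter (the "damp" described above).
    return 0.0 if abs(accel) * dt <= threshold else accel

def damped_follow(cam_pos, cam_vel, target_pos, dt, gain=4.0, drag=3.0):
    accel = (target_pos - cam_pos) * gain   # accel is the (scaled) diff vector
    cam_vel += damp(accel, dt) * dt         # vel = vel + damp(accel)*dt
    cam_vel *= max(0.0, 1.0 - drag * dt)    # drag added so the camera settles
    cam_pos += cam_vel * dt                 # pos = pos + vel*dt
    return cam_pos, cam_vel
```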
Then you render the scene. From my understanding, you want the camera to smoothly follow the object (like a third-person camera).
However, I see a relevant bug here with forcing the camera to follow the object's position each frame, but I can't pin it down yet; my gut is telling me that…
Just tell me what you expect the camera to do when the object moves forward and then rapidly to the left, continuing in the left direction. What happens at this rapid direction change? Do you want the camera to follow the path? Or is the camera positioned behind the object and rotated towards it, and you want it to move smoothly through a curve? It's unclear how you expect/want it to behave. Tell us and you'll get your answer.
BTW how far are you from the origin? If you are using float and are much more than a dozen or so km, you can start to have precision issues which might be what's causing your problem.
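As a quick illustration of this point, the spacing between adjacent representable values at a given distance from the origin can be computed directly (Python here just for the arithmetic):

```python
import math
import struct

def float32_ulp(x: float) -> float:
    # Distance from x (rounded to a 32-bit float) to the next representable
    # 32-bit float -- the smallest position change a float can express there.
    cur_bits = struct.unpack('<I', struct.pack('<f', x))[0]
    nxt = struct.unpack('<f', struct.pack('<I', cur_bits + 1))[0]
    cur = struct.unpack('<f', struct.pack('<f', x))[0]
    return nxt - cur

# At 800 world-units from the origin:
print(float32_ulp(800.0))   # ~6.1e-05 (2^-14) for 32-bit floats
print(math.ulp(800.0))      # ~1.1e-13 for 64-bit doubles
```

So at 800 units out, 32-bit floats still resolve far below a hundredth of a unit; real trouble starts several orders of magnitude farther away.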
_WeirdCat_ said: When the object stops moving, the camera does not; it reaches the object's position, and then--my bet--it jitters due to floating-point imprecision.
The thing is, this is happening under constant motion. Indeed, it primarily happens under constant motion, from what I've seen.
In addition, the jitter looks, to me at least, rather larger than something of the order of ~10^-6.
Gnollrunner said: BTW how far are you from the origin? If you are using float and are much more than a dozen or so km, you can start to have precision issues which might be what's causing your problem.
That is a thought that has occurred to me, indeed. However, I'm only about 800 world-units out, and have had the issue appear much closer to the origin, as I recall.
However! I think that I've done it!
Earlier in the thread, it was suggested that I divide my time-step. And at the time I did so, and found that it didn't help.
But in that attempt I only applied this division to the camera's update--not to the target's. It then occurred to me last night (I think that it was) that perhaps applying it to the target--thus changing the relationship between camera and target over the course of the divided time-step--might be important.
So just a short while ago I tried it--and indeed, the jitter seems to have disappeared as a result! :D
In short, what I did is as follows: In the semi-pseudocode that I included in my first post, I changed “updatePlayer” to be something like this:
def updatePlayer(dt):
    newDt = dt
    while newDt > someVerySmallValue:
        # Update both target and camera by a small, fixed increment
        position += velocity * someVerySmallValue
        updateCamera(someVerySmallValue)
        # Keep doing so until the total delta-time for the frame is all but
        # used up
        newDt -= someVerySmallValue
    # Perform the update one more time, using whatever delta-time remains
    position += velocity * newDt
    updateCamera(newDt)
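For reference, here is a self-contained, runnable version of that loop (the exponential follow inside update_camera is only a stand-in for the real camera update, and SUBSTEP plays the role of someVerySmallValue):

```python
SUBSTEP = 1.0 / 120.0  # the fixed small increment

def update_camera(cam_pos, target_pos, step, follow_speed=10.0):
    # Placeholder: simple exponential follow instead of the real camera logic.
    return cam_pos + (target_pos - cam_pos) * min(1.0, follow_speed * step)

def update_player_and_camera(player_pos, player_vel, cam_pos, dt):
    remaining = dt
    while remaining > SUBSTEP:
        # Advance both the target and the camera by the fixed increment.
        player_pos += player_vel * SUBSTEP
        cam_pos = update_camera(cam_pos, player_pos, SUBSTEP)
        remaining -= SUBSTEP
    # One final step with whatever delta-time remains, so no time is lost.
    player_pos += player_vel * remaining
    cam_pos = update_camera(cam_pos, player_pos, remaining)
    return player_pos, cam_pos
```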
My thanks to all of you who helped in this thread! It has been rather appreciated! ^_^
Thaumaturge said: It then occurred to me last night (I think that it was) that perhaps applying it to the target
Glad you did it : )
Yes, if you subdivide the timestep, you should apply this to all variables affected by that timestep--likely by linear interpolation or extrapolation on the target, as you did. Though currently you will still have some timesteps that are much smaller, due to the division remainder. You could improve this further:
const float defaultSubstep = 1/120.f;
int substepCount = max(1, (int)round(timestep / defaultSubstep));
float currentSubstep = timestep / float(substepCount); // slightly variable step sizes, but no remainder, so no outlier steps
for (int i = 0; i < substepCount; i++) Update(currentSubstep);
Edit: Looking more closely, I see you have no smaller timestep, because you ignore the remainder of dt/someVerySmallValue. But this way your whole camera should go slightly out of sync with real time, as the remainders are not integrated. It may be noticeable, or it may not.
JoeJ said: Edit: Looking more closely, I see you have no smaller timestep, because you ignore the remainder of dt/someVerySmallValue.
I'm not sure what remainder there would be: I'm not using division, and even if I were, it would be floating-point division, and thus would include the remainder.
(There might be some loss to floating-point precision, but that should be minimal, and be accounted for in subsequent steps due to the target-distance being slightly higher or lower than it should be.)
JoeJ said: Yes, if you subdivide timestep, you should apply this to all variables affected from this timestep.
One thing that I'll note here is that I'm not applying the division to all objects. For example, enemies do not use the divided time-step. It's currently only used for the player's (and thus the target's) position and velocity-handling (excluding player-control), and for the camera's position.
Thus far I haven't seen any ill effects of this, at least.