Thaumaturge said:
For myself, I… generally don't like motion blur in video-games; I prefer the view to be nice and crisp.
(Note that, if I'm not much mistaken, there's little such blur in human vision--some, but not much. The heavier blur that may come to mind is perhaps an artefact of familiarity with cinema, the cameras for which are I gather more prone to blur than is a human eye, and where it helps to cover the relatively-low frame-rate.)
Most people don't like blurring effects, but I think that's caused by exaggerated use of them.
For example, we could generate a blur along motion by always keeping the last two frames and displaying a mix of those two plus the current one.
This would give us motion trails (but also lag), like I have used in the title screen of my game. Or the unwanted ghosting effect of TAA, which is the same thing.
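A minimal sketch of that naive trail effect (names and weights are my own, assuming frames are just flat lists of grayscale pixel values):

```python
def blend_trails(current, prev1, prev2, weights=(0.5, 0.3, 0.2)):
    """Mix the current frame with the previous two frames.

    The result lags behind `current`, and old frames "ghost" through
    the image -- exactly the TAA-like artifact described above.
    """
    wc, w1, w2 = weights
    return [wc * c + w1 * p1 + w2 * p2
            for c, p1, p2 in zip(current, prev1, prev2)]

# A pixel that was bright two frames ago still bleeds into the output:
frame_a = [1.0, 0.0, 0.0]   # oldest
frame_b = [0.0, 1.0, 0.0]
frame_c = [0.0, 0.0, 1.0]   # current
print(blend_trails(frame_c, frame_b, frame_a))  # [0.2, 0.3, 0.5]
```

Note the output mixes three different points in time instead of one frame's time span, which is why it reads as ghosting rather than motion blur.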
That's not what we want for correctness, agreed.
Real-world cameras do better, but rarely perfectly. Afaict, they have their shutter open for a period of time, and setting this time affects both motion blur and exposure: the longer the shutter is open, the more light accumulates on the image sensor.
So if the shutter is open for only half the frame period to get the right exposure, the camera captures only half of the time span it would need for the correct amount of motion blur. The resulting animation is much better than having no MB at all, but it's still not really smooth either.
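To put numbers on that half-period case (my own illustration, using the classic "shutter angle" convention, where 360 degrees means the shutter is open for the whole frame):

```python
def shutter_coverage(fps, shutter_angle_deg):
    """Return the fraction of each frame's duration the shutter is open,
    plus the open time in milliseconds."""
    frame_time = 1.0 / fps
    fraction = shutter_angle_deg / 360.0
    return fraction, frame_time * fraction * 1000.0

# Classic cinema: 24 fps with a 180-degree shutter captures only half
# of each frame's time span, so half of the motion goes unsampled.
frac, open_ms = shutter_coverage(24, 180)
print(frac, round(open_ms, 2))  # 0.5 20.83
```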
For offline CGI it is common to integrate over the whole period a frame lasts, which is ideal: it samples a temporal segment of the signal we are trying to capture, not just a single point in time, which is right.
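A toy version of that full-interval sampling, assuming we can evaluate the scene at arbitrary times within the frame (here the "scene" is just a 1D position over time; all names are mine):

```python
def accumulate_motion_blur(render, t0, frame_time, samples=8):
    """Average `samples` renders spread across the whole frame interval,
    i.e. approximate the integral over [t0, t0 + frame_time) instead of
    sampling a single instant."""
    total = 0.0
    for i in range(samples):
        # Sample at the midpoint of each sub-interval (midpoint rule).
        t = t0 + (i + 0.5) / samples * frame_time
        total += render(t)
    return total / samples

# An object moving at 120 units/s, rendered for the frame starting at
# t = 0 with a 60 fps frame time: the blurred result lands at the
# frame's temporal center, not at its start.
position = lambda t: 120.0 * t
print(accumulate_motion_blur(position, 0.0, 1.0 / 60.0))  # ~1.0
```

Offline renderers do this with far more sophistication (stochastic sampling per pixel), but the principle is the same: cover the whole frame interval.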
I assume that if you can do this correctly, you won't find many people complaining about the blur. They shouldn't even notice it; they should only notice that the animation looks smooth and that the quantization into still image frames is less noticeable than in other games.
That's separate from other, truly subjective things, like DOF on/off, a sharpening filter on/off, or a general preference for sharp vs. smooth images.
Thaumaturge said:
it hadn't occurred to me that we might be talking about pixel-perfect movement, which was perhaps silly given the mention of pixels. ^^; In that case, indeed, I'd be inclined to suggest movement-speeds that are integers when measured in pixels-per-frame.
But that's a very hard limitation on game design. Nowadays things mostly have to move at arbitrary, fractional speeds, like in the real world. But if we still want smooth animation for that at low FPS, MB is the only option to get it.
Thus, even people who dislike blur in general should keep an open mind about MB. I mean, currently we mostly can't really afford it, but the time will come, assuming chips can still shrink a bit.
And the current standard isn't bad either: most games have screen-space motion blur (SS MB), and it helped a lot with making them appear smooth when it was introduced.
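For reference, the screen-space approach boils down to sampling along each pixel's velocity vector from a velocity buffer. A crude 1D sketch of the idea, names my own (real implementations work in 2D on the GPU and handle occlusion much more carefully):

```python
def screen_space_blur_1d(image, velocity, taps=5):
    """Blur each pixel along its per-pixel velocity (in pixels/frame).
    `velocity[i]` says how far pixel i moved since the last frame."""
    n = len(image)
    out = []
    for i in range(n):
        total = 0.0
        for k in range(taps):
            # Step along the pixel's motion vector, centered on the pixel.
            offset = velocity[i] * (k / (taps - 1) - 0.5)
            j = min(n - 1, max(0, round(i + offset)))  # clamp to the image
            total += image[j]
        out.append(total / taps)
    return out

# Zero velocity leaves the image unchanged; nonzero velocity smears
# the bright pixel along its direction of motion.
img = [0.0, 0.0, 1.0, 0.0, 0.0]
print(screen_space_blur_1d(img, [0.0] * 5))  # unchanged
print(screen_space_blur_1d(img, [2.0] * 5))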
But I really miss it in 2D games. I play a Super Mario clone a lot, and my biggest issue with its gfx is not the old-school pixel art but the chunky background scrolling.
I'd love to try fixing this with MB, but I'm not entirely sure how noticeable the blurring would be.
Thaumaturge said:
Of course, if the frame-rate varies then once again there's likely to be some unevenness in the animation, I fear.
Which even MB cannot fix. I don't have a variable-refresh-rate display, but I don't believe that can fix fluctuating frame rates either. It may, however, help if you miss the 60 fps target just slightly, e.g. when getting only 50 fps for some time.
I also doubt blowing 30 fps up to 60 with something like DLSS 3 frame interpolation can help much. That only makes sense if we already have 60 fps but perhaps also a display supporting 120 Hz.
We really want a high, stable and constant frame rate first, imo.
Thaumaturge said:
Hmm… Which may be covered, to some degree at least, by an intentionally low animation frame-rate: Let's say that we have an animation running at 6 animation frames per second, i.e. with ~0.17s between animation-frames, and 10 rendering-frames passing per one animation-frame. Then the difference between a rendering frame-rate of, say, 58FPS and 60FPS--i.e. between ~0.0172s and ~0.0167s per rendering-frame, coming to ~0.00057s, is a small fraction of the total time per animation-frame--specifically, about 0.0034, or about 0.3%. As a result, the variation might be less noticeable.
Yeah, as you said earlier:
Thaumaturge said:
As a result, it may be wise to incorporate the time since the last frame (the “delta-time”) into your animation and/or movement code.
We want to sync to real time, no matter what the actual frame rate is. This also includes animation and physics interpolation/extrapolation: if the animation has a key frame only every second, we likely want to interpolate between key frames; if the physics are 5 ms ahead of display time, we need to interpolate with the previous step. That's the most important part and will resolve most issues. Though, I can't remember any game that failed at this--it's standard, and everybody gets it right to this point. It's usually good enough, but if we want silky-smooth animation beyond that, we either need crazy-high FPS or MB. The latter should be cheaper if we accept some error.
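Both of those cases boil down to the same lerp, driven by real (display) time rather than frame count. A hedged sketch, names and numbers mine:

```python
def lerp(a, b, t):
    return a + (b - a) * t

def sample_keyframes(keyframes, time):
    """keyframes: list of (timestamp, value), sorted by timestamp.
    Return the value interpolated at `time` (clamped at the ends)."""
    if time <= keyframes[0][0]:
        return keyframes[0][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if time <= t1:
            return lerp(v0, v1, (time - t0) / (t1 - t0))
    return keyframes[-1][1]

def interpolate_physics(prev_state, curr_state, alpha):
    """Physics stepped at a fixed rate: blend the last two states by
    `alpha`, the fraction of the step the display time lies at."""
    return lerp(prev_state, curr_state, alpha)

# Keyframes one second apart: display time 0.25 s lands a quarter of
# the way between them.
keys = [(0.0, 0.0), (1.0, 10.0)]
print(sample_keyframes(keys, 0.25))  # 2.5

# Physics stepped to t = 0.020 s, previous step at t = 0.015 s, frame
# displayed at t = 0.018 s: alpha = (0.018 - 0.015) / 0.005 = 0.6.
print(interpolate_physics(100.0, 105.0, 0.6))  # 103.0
```

Interpolating with the previous physics step trades a little latency for smoothness; extrapolating the current step forward avoids the latency but can visibly overshoot on collisions.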