In the article, I'm having trouble understanding what to do with Time. Why is it passed into my Update function along with Delta Time? Do I have my variables set up properly?
Some game systems or algorithms benefit from having an absolute elapsed time value from the start of the game/scene/whatever. It's common to pass it along to shaders for some effects, for instance, or to use it for in-game timers.
There are problems with doing this, though, related to precision. If you ever use an absolute time value, make sure it's a double or an integral type. Floats will begin losing necessary precision for all but the shortest play sessions.
https://randomascii.wordpress.com/2012/02/13/dont-store-that-in-a-float/

> movies are only 24Hz, and you don't hear people complaining that movies aren't smooth.
I'm here to complain about 24hz movies. Panning and other fast action looks like total crap at 24hz; higher-frame-rate movies are a vastly superior viewing experience (except for movies filmed at a high frame rate but using 24hz production techniques, like The Hobbit's 48fps release - some of those movies need the blur of 24hz to hide or mask low-quality props, sets, costumes, or effects).
Most of the reason to buy a 120hz or faster monitor today (and a rig that can drive it) is to enjoy the smoother experience it provides.
The advantages of 120hz are a bit more complex than just running faster. One of the big reasons is that _lower_ framerates are smoother.
Remember that with vsync, you're stuck hitting factors of the monitor's refresh rate. That is, a 60hz monitor can only smoothly run at 60hz (1x), 30hz (2x), 20hz (3x), 15hz (4x), etc. This is why we often talk about 60fps vs 30fps games. A 30fps game might actually be capable of running at 58fps, but with vsync enabled (a practical necessity for many games, esp. on consoles where the vendors literally require it), that 58fps turns into 30fps.
A 120hz monitor gives you more wiggle room, as it can run at 120hz (1x), 60hz (2x), 40hz (3x), 30hz (4x), 24hz (5x), etc. Note that it offers an option between 30hz and 60hz (40hz), that it picks up 24hz, and so on. The even nicer 240hz monitors give even more options, including 24hz, 30hz, 40hz, 48hz, 60hz, and 80hz, among others. This is also why 144hz is a popular option, as it gives you smoothness at 24hz, 36hz, 48hz, 72hz, etc. (notably lacking 60hz, though, which is less than ideal for all those games that have foolishly hardcoded a 60hz refresh rate). Of course, many 144hz monitors can also run at more traditional refresh rates, avoiding the compatibility problem, at least for full-screen/exclusive-mode games.
Also keep in mind that there are displays that run at "weird" refresh rates like 50hz, 72hz, etc., though they are uncommon these days. Depending on your target audience, you might need to consider those monitors and their smooth refresh rate factors.
Of course, with GSync/FreeSync, this all suddenly matters a lot less, since you can get smooth VSync behavior out of an arbitrary update rate (depending on the monitor's range - some only go as low as 48hz before turning adaptive sync off!).
The result is that you're far better off buying a (good) adaptive sync monitor than a high refresh rate monitor, though you're best off buying one that's both (a 240hz monitor with adaptive sync down to at least 30hz - such monitors exist, depending on the price, resolution, and other features you're willing to settle for). Of course, you can't assume all your game's players will have such a nice monitor, so you'd better be developing for, and regularly testing against, a more traditional fixed-sync 60hz monitor profile. :)
tl;dr: it's all about the fastest refresh rate a monitor can offer smooth updates for a given game's variable FPS, not just the fastest possible refresh of which a monitor is capable.
A game should not lock itself to a visual refresh rate outside of vsync, so it stays compatible with any monitor. It should use interpolation, since it could be run at virtually any refresh rate, and it should use a fixed timestep for _gameplay_ purposes to achieve deterministic and reliable simulation, with the timestep chosen for the minimum latency the specific game requires. The game _could_ also sample input more frequently than the graphics refresh rate and apply inputs on physics steps, allowing a game with a 180hz timestep to sample input at ~5.6ms intervals despite probably only being able to render frames at 30-60hz on a typical gaming machine; there's usually no reason to bother doing that unless you're a twitch-focused game, though.