@supervga You're not wrong. Just trying to provide a minimal starting example so the OP has a basic idea.
On another note, this is why you usually scale things like animation or user-controlled (non-physical) motion by the frame's delta time, to avoid such inconsistencies (sometimes clamping the delta to a maximum and minimum to avoid extremes). I've come across the article you linked a few times now and it's definitely recommended reading, but note that it's mostly about time steps in physics simulations. There you usually want to fix your timestep, or run multiple physics iterations per frame based on the delta time, to avoid strange physical inconsistencies. I've tried implementing my own physics engine in the past and often ran into janky collision resolution behavior for exactly this reason.
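To make the first part concrete, here's a minimal sketch (in Python, just for illustration) of delta-time scaling with clamping. The names `MIN_DT`, `MAX_DT`, and `update_animation` are my own placeholders, and the clamp bounds are arbitrary examples, not values from any particular engine:

```python
MIN_DT = 1.0 / 240.0   # floor: ignore absurdly tiny deltas
MAX_DT = 1.0 / 15.0    # ceiling: cap spikes (e.g. after a hitch or breakpoint)

def clamp_dt(dt):
    """Clamp a raw frame delta to a sane range before using it."""
    return max(MIN_DT, min(dt, MAX_DT))

def update_animation(position, speed, dt):
    """Move at `speed` units per wall-clock second, regardless of framerate."""
    return position + speed * clamp_dt(dt)
```

The point is that a frame at 30 FPS moves things twice as far as a frame at 60 FPS, so the same wall-clock second covers the same distance either way, and the clamp keeps one huge delta (say, after dragging the window) from teleporting everything.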
But when it comes to rendering, you sometimes don't want to limit the framerate at all: you want the rendering loop running as fast as your rasterizer can draw frames and your screen can display them. Drawn frames and physics iterations don't have to correspond one-to-one. A game without physics or collision handling (rare, but it happens) might not even need such a loop. On the other hand, if you're developing on non-console hardware and want to provide a fixed framerate for your games, you'll probably want to take the fixed-timestep camp's suggestions.
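For anyone skimming this thread, here's a rough sketch of how the two loops decouple: physics advances in fixed steps via an accumulator while rendering runs once per frame, as fast as it likes. This is the general accumulator pattern, not anyone's exact implementation; `step_physics` and `render` are placeholders you'd supply:

```python
PHYSICS_DT = 1.0 / 60.0  # fixed simulation step, independent of framerate

def run_frame(accumulator, frame_dt, step_physics, render):
    """Consume one rendered frame's worth of time in fixed physics steps.

    Returns the leftover accumulator and the number of physics steps taken,
    so a fast frame may take zero steps and a slow frame several.
    """
    accumulator += frame_dt
    steps = 0
    while accumulator >= PHYSICS_DT:
        step_physics(PHYSICS_DT)   # always integrates with the same dt
        accumulator -= PHYSICS_DT
        steps += 1
    # alpha in [0, 1) can interpolate the render state between physics steps
    alpha = accumulator / PHYSICS_DT
    render(alpha)
    return accumulator, steps
```

So a 30 FPS frame runs two 60 Hz physics steps and renders once, while a 144 FPS frame often renders without stepping physics at all; the simulation stays deterministic either way.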
EDIT: Unity just posted an interesting article about how they fixed their delta time in the 2020.2 update.