Do you mean to say that clients are simulated at the same tick rate as non-client actors on the server, and that they rely on a constant stream of input? I don't do this, because any network conditions that cause a packet to arrive late / early will force the server to correct a client, who may actually be "right" in most cases.
I suggest that you focus on solving problems you're actually having, rather than problems that you predict you might conceivably have in the future.
Specifically, jitter on the Internet is actually pretty predictable -- for most game sessions, the amount of jitter will be approximately constant, and thus a constant (measured or derived) amount of de-jitter buffer is a good solution. When jitter suddenly spikes, it's typically in connection with packet loss, so compromising the design of your network protocol to get a marginally better experience in a rare, gonna-suck-no-matter-what case is often not the right choice.
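To illustrate the "measured" option, here's a minimal sketch of sizing a de-jitter buffer from observed arrival times. All names are my own for illustration; a real implementation would also age out old measurements and cap the depth:

```python
class DejitterBuffer:
    """Holds incoming ticks long enough to smooth out measured jitter.

    Illustrative sketch: class and method names are hypothetical,
    not from any real networking library.
    """

    def __init__(self, tick_interval=1.0 / 60.0):
        self.tick_interval = tick_interval
        self.arrival_offsets = []  # arrival_time - expected_time, in seconds
        self.buffer = {}           # tick number -> payload

    def record_arrival(self, tick, arrival_time, expected_time, payload):
        self.arrival_offsets.append(arrival_time - expected_time)
        self.buffer[tick] = payload

    def depth_ticks(self):
        """Buffer depth, in ticks, that covers the observed jitter spread."""
        if len(self.arrival_offsets) < 2:
            return 1  # no spread measured yet; hold at least one tick
        spread = max(self.arrival_offsets) - min(self.arrival_offsets)
        # Round up to whole ticks so the buffer covers the full spread.
        return max(1, int(spread / self.tick_interval) + 1)
```

The point is that the depth is derived once from a measurement and then held roughly constant, rather than chasing every individual packet.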
The other cause of timing drift is clock skew. Clocks in computers are actually pretty accurate, but there will be some amount of drift -- say, one second in eight hours of gameplay for a pretty crappy clock. If you're simulating at 60 Hz, that's 60 ticks of drift spread over 480 minutes, or one tick every eight minutes. Thus, "bumping" the clock a little bit when you detect that it's ahead or behind, by detecting packets being ahead or behind, is often a good solution.
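A sketch of that "bumping," again with made-up names and thresholds: smooth the observed tick error so a single late or early packet doesn't trigger a correction, and only bump by one tick at a time when the smoothed error clearly exceeds a tick:

```python
class TickClock:
    """Nudges the local tick count when packets show sustained drift.

    Illustrative sketch: the smoothing factor (0.1) and the one-tick
    threshold are assumptions, not values from any particular engine.
    """

    def __init__(self):
        self.local_tick = 0
        self.error_ema = 0.0  # smoothed (remote_tick - local_tick)

    def advance(self):
        """Called once per simulation step by the local fixed-rate loop."""
        self.local_tick += 1

    def on_packet(self, remote_tick):
        """Called when a packet stamped with the sender's tick arrives."""
        error = remote_tick - self.local_tick
        # Exponential moving average: one late/early packet barely moves it.
        self.error_ema = 0.9 * self.error_ema + 0.1 * error
        if self.error_ema > 1.0:
            self.local_tick += 1      # we're behind: skip ahead one tick
            self.error_ema -= 1.0
        elif self.error_ema < -1.0:
            self.local_tick -= 1      # we're ahead: give back one tick
            self.error_ema += 1.0
```

Because skew accumulates at something like a tick every few minutes, correcting by single ticks like this is invisible in practice, whereas snapping the clock by the full error would cause a visible hitch.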