In my case the physics simulation is (and will probably remain) very basic, but I still think that running different tick rates would cause problems. Also, since I'm aiming to support a low number of players per server, I probably won't need to lower the tick rate there; running physics at 30 Hz and rendering as fast as possible with smoothing has already proved to be a good solution.
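For reference, this is roughly the loop I mean (a minimal sketch of the fixed-timestep-with-interpolation pattern; `State`, `integrate()` and `render()` are trivial stand-ins for the real engine, not actual code from my project):

```cpp
#include <chrono>

struct State { float x = 0, vx = 1; };   // whatever the simulation actually holds

// Stand-in for the real physics step: advance the state by a fixed dt.
State integrate(State s, float dt) {
    s.x += s.vx * dt;
    return s;
}

// Stand-in for drawing one frame.
void render(const State&) {}

int main() {
    using clock = std::chrono::steady_clock;
    const float dt = 1.0f / 30.0f;        // fixed 30 Hz physics tick
    float accumulator = 0.0f;

    State previous, current;
    auto last = clock::now();

    for (;;) {                            // the real loop has an exit condition
        auto now = clock::now();
        accumulator += std::chrono::duration<float>(now - last).count();
        last = now;

        // Step physics in fixed increments, keeping the last two states
        // so rendering can blend between them.
        while (accumulator >= dt) {
            previous = current;
            current  = integrate(current, dt);
            accumulator -= dt;
        }

        // Smoothing: interpolate by how far we are into the next tick.
        float alpha = accumulator / dt;
        State blended;
        blended.x  = previous.x  + (current.x  - previous.x)  * alpha;
        blended.vx = previous.vx + (current.vx - previous.vx) * alpha;
        render(blended);
    }
}
```

The `alpha` blend between the last two physics states is what makes the rendering look smooth even though the simulation itself only advances at 30 Hz.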
By the way, what about handling dropped or out-of-order messages delivered to the server? Rereading your answers, I like the idea of delaying execution on the server, since it should let me keep the server code simple and avoid storing past entity states. And since I plan to use a client-server model for single player and local multiplayer as well, maybe I could adapt the server delay based on the average latency, or better yet set the delay to 0 when playing single player or on a local network and to something more meaningful when running online. Does that make sense?
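To make the question concrete, here's a rough sketch of what I have in mind by an adaptive delay (everything here is my own assumption, not something from your answer: `PlayerInput`, the 30 Hz tick, and the moving-average latency estimate are all hypothetical):

```cpp
#include <map>
#include <cstdint>

struct PlayerInput { uint32_t clientTick; /* buttons, axes, ... */ };

class InputScheduler {
public:
    explicit InputScheduler(bool localPlay) : local(localPlay) {}

    // Feed in a latency sample whenever one is available (e.g. from a ping).
    // An exponential moving average keeps the estimate stable.
    void onLatencySample(float rttSeconds) {
        avgRtt = avgRtt * 0.9f + rttSeconds * 0.1f;
    }

    // Buffer an input for future execution. The map keeps inputs ordered by
    // execution tick, so out-of-order packets slot in correctly as long as
    // their scheduled tick hasn't passed yet.
    void enqueue(const PlayerInput& in) {
        pending.emplace(in.clientTick + delayTicks(), in);
    }

    // Called once per server tick: apply everything scheduled up to now.
    // Inputs that arrive after their slot are simply ignored in this sketch.
    template <typename Apply>
    void runTick(uint32_t serverTick, Apply applyInput) {
        auto end = pending.upper_bound(serverTick);
        for (auto it = pending.begin(); it != end; ++it)
            applyInput(it->second);
        pending.erase(pending.begin(), end);
    }

private:
    uint32_t delayTicks() const {
        if (local) return 0;                      // single player / LAN: no delay
        // Half the round trip, converted to 30 Hz ticks, plus one tick of slack.
        return 1 + static_cast<uint32_t>(avgRtt * 0.5f * 30.0f);
    }

    bool local;
    float avgRtt = 0.1f;                          // seeded guess: 100 ms RTT
    std::multimap<uint32_t, PlayerInput> pending; // executeTick -> input
};
```

In this sketch inputs that arrive too late are just dropped; applying them immediately or rewinding would be the alternatives, and avoiding exactly that kind of complexity is why the delay idea appeals to me.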