8 hours ago, Josheir said:
What I'm trying to say is that checking for receiving a message would need to be in a loop, and my prior message was about this, the ineffectiveness of a (render) loop when using WXWidgets.
A desktop application is normally designed to be well-behaved among the other desktop applications. You try to reduce memory and CPU usage, leaving room for the other applications to do their work. A computer running a desktop that is not being used has an idle CPU: everything is blocked, waiting for the next action of the user. Games, on the other hand, want as much CPU and as many resources as they can get, to give the user the best possible experience. They boldly assume they are the only application running and burn CPU cycles like there is no tomorrow. In other words, the basic premise of a desktop application is the complete opposite of that of a game.
So the basic idea of a desktop application is to block until something "interesting" happens, an event. You deal with the event as efficiently as you can, then you block again, waiting for the next event. You don't loop, you don't poll the network to check if something happened; you block until the next event. For a cashier application this makes a lot of sense, of course. If you open the shop at 11 in the morning and the first customer arrives at 4 in the afternoon, there is no point in wasting five hours of CPU time checking that nothing happened and rendering the exact same display at 50 fps. Even when the application is in use, a user enters a handful of key-presses (or, in your case, a handful of mouse clicks) every ten minutes, in between pouring and serving drinks, walking between kitchen and customers, taking the dishes back to the kitchen, and cleaning the tables for the next customer.
Can you have a networked, animated desktop application? Sure you can. You deal with the network by blocking until data arrives or until there is room to send outgoing messages. Unix has the "select" call for exactly that purpose (http://man7.org/linux/man-pages/man2/select.2.html); Windows has similar facilities. Some GUI toolkits have a standard event for notification about network activity; I am not sure whether wxWidgets has that. If not, you can always set up a separate thread that deals with the network and pushes events to the main loop, to let the main application know it should check the messages the thread has received, as sketched below.
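As a minimal sketch of that last option, assuming wxWidgets 3.x: a worker thread blocks on the network and hands incoming data to the GUI thread through the event queue. MyFrame, MyApp, and ReadFromSocket are hypothetical names; the fake socket read just sleeps and returns a canned string where a real application would block on recv() or select().

```cpp
#include <wx/wx.h>
#include <wx/thread.h>

// Hypothetical frame: owns a worker thread that blocks on the network and
// posts wxThreadEvents back to the GUI thread.
class MyFrame : public wxFrame, public wxThreadHelper
{
public:
    MyFrame() : wxFrame(nullptr, wxID_ANY, "Networked app")
    {
        // Deliver the worker's notifications to OnNetworkData on the GUI thread.
        Bind(wxEVT_THREAD, &MyFrame::OnNetworkData, this);
        CreateThread(wxTHREAD_JOINABLE);
        GetThread()->Run();
        // A real app would stop and Wait() for the thread in the destructor;
        // omitted here for brevity.
    }

protected:
    // Runs in the worker thread: block on the network, never touch the GUI here.
    wxThread::ExitCode Entry() override
    {
        while (!GetThread()->TestDestroy())
        {
            wxString msg = ReadFromSocket();      // blocking read
            wxThreadEvent* evt = new wxThreadEvent();
            evt->SetString(msg);
            wxQueueEvent(this, evt);              // thread-safe hand-off to the main loop
        }
        return (wxThread::ExitCode)0;
    }

private:
    // GUI thread: this runs from the event loop, like any other event handler.
    void OnNetworkData(wxThreadEvent& evt)
    {
        SetTitle("received: " + evt.GetString());
    }

    // Hypothetical stand-in for a blocking recv()/select() on a real socket.
    wxString ReadFromSocket()
    {
        wxMilliSleep(1000);                       // pretend we waited for the network
        return "hello from the net";
    }
};

class MyApp : public wxApp
{
public:
    bool OnInit() override { (new MyFrame())->Show(); return true; }
};

wxIMPLEMENT_APP(MyApp);
```

The key point is that the GUI thread stays blocked in its normal event loop; the worker's wxQueueEvent call is what wakes it up, exactly like a mouse click would.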
For animations, you need some form of regular event arrival. Some toolkits have timer events for this purpose, others have an idle event that you can use. Of course, a thread with a timer that pushes events is also an option. A timer-based sketch follows below.
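For example, with a wxWidgets timer (again a sketch; AnimPanel is a hypothetical name, and the panel could be dropped into a frame like the one above), the timer event drives the repaints, so the animation stays inside the normal event loop:

```cpp
#include <wx/wx.h>

// Hypothetical panel: a wxTimer fires roughly every 16 ms and queues repaints.
class AnimPanel : public wxPanel
{
public:
    AnimPanel(wxWindow* parent) : wxPanel(parent), m_timer(this)
    {
        Bind(wxEVT_TIMER, &AnimPanel::OnTick, this);
        Bind(wxEVT_PAINT, &AnimPanel::OnPaint, this);
        m_timer.Start(16);   // ~60 events per second, requested but not guaranteed
    }

private:
    void OnTick(wxTimerEvent&)
    {
        m_frame++;           // advance the animation state
        Refresh(false);      // queue a paint event; the event loop stays in charge
    }

    void OnPaint(wxPaintEvent&)
    {
        wxPaintDC dc(this);
        dc.DrawText(wxString::Format("frame %d", m_frame), 10, 10);
    }

    wxTimer m_timer;
    int m_frame = 0;
};
```

Note that Start(16) is a request, not a promise; the events simply arrive late if the machine is busy, which is the next point.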
Note that you get very few timing guarantees: your application may be minimized, or the computer may be busy running a different application. That is a natural consequence of being one of many applications running.
As for inefficiency, how do you define that? Above, I just argued that games are highly inefficient in their CPU usage when nothing of interest is happening. Given how rarely events occur (users and networks are just plain terribly slow from a CPU point of view), event-driven updates are extremely efficient. Note that since the CPU is normally idle, you generally have the full CPU available when something does happen. You would generally do animations on a plain canvas; going through the event system is indeed slower, but really, how much animation do you need to press a button? More modern toolkits integrate the GPU, so you can have hardware-accelerated rendered surfaces in the application; see e.g. JavaFX 2 (just pointing out its existence, I have no real experience with it beyond playing with it a bit). Likely other toolkits do something similar.
As a question to ponder: isn't it funny that a game burns CPU cycles like a madman, only to find that the user and the network did nothing for about 50-70% of the time, and then we run short of CPU time elsewhere?