
UI Library - Overlapping widgets and Z ordering

Started by January 22, 2022 07:46 PM
4 comments, last by Alberth 2 years, 10 months ago

Heyo, I'm refactoring my UI lib to support widgets that can be dragged on top of each other, and I'm wondering if there's a conventional way to solve this.

I currently have a simple, naive widget graph. It starts with a “Screen” construct, which the developer can add child widgets to (then add child widgets to those widgets, etc). When I'm propagating events through this graph, I iterate through the top-level list of widgets doing hit tests on each one, and I follow the first path that returns true. I continue until I either get no hits, or I hit a leaf widget that handles the event.
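Roughly, in simplified C++ (the names here are made up for illustration, not my actual code), the propagation looks like:

```cpp
#include <memory>
#include <vector>

// Illustrative types only; none of these names come from an existing library.
struct Event { int x = 0, y = 0; };

struct Widget {
    int x = 0, y = 0, w = 0, h = 0;                    // screen-space rectangle
    std::vector<std::unique_ptr<Widget>> children;

    virtual ~Widget() = default;

    bool hitTest(const Event& e) const {
        return e.x >= x && e.x < x + w && e.y >= y && e.y < y + h;
    }

    virtual bool handleEvent(const Event&) { return false; }   // true = consumed

    // Follow the first child whose hit test passes; if no child is hit,
    // let this widget try to handle the event itself.
    bool propagate(const Event& e) {
        for (auto& child : children)
            if (child->hitTest(e))
                return child->propagate(e);
        return handleEvent(e);
    }
};
```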

Because of this setup, when I have 2 widgets on top of each other, whichever one happens to be earliest in the graph will get the event. Instead, I need a way to set one widget as being in front of another, and I need to use this info to pass events appropriately.

I know about Z-ordering and figure it's probably the solution, but I'm getting confused on the specifics. I assume you have the developer set Z values on each widget and follow an event propagation algorithm like:

  1. Iterate all children and build a list of the ones that are hit
  2. Compare the Z values of all hit children and pass the event to the one furthest in front
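In code, I imagine that two-step dispatch would look something like this (just a sketch; zValue and the other names are placeholders I'm assuming, not anything existing):

```cpp
#include <memory>
#include <vector>

// Rough sketch of dispatch with explicit z values; every name is made up for illustration.
struct Event { int x = 0, y = 0; };

struct Widget {
    int x = 0, y = 0, w = 0, h = 0;
    int zValue = 0;                        // assumed convention: larger = closer to the viewer
    std::vector<std::unique_ptr<Widget>> children;

    virtual ~Widget() = default;

    bool hitTest(const Event& e) const {
        return e.x >= x && e.x < x + w && e.y >= y && e.y < y + h;
    }

    virtual bool handleEvent(const Event&) { return false; }

    bool propagate(const Event& e) {
        // Step 1: consider every child that is hit; step 2: keep the front-most one.
        Widget* front = nullptr;
        for (auto& child : children)
            if (child->hitTest(e) && (!front || child->zValue > front->zValue))
                front = child.get();
        return front ? front->propagate(e) : handleEvent(e);
    }
};
```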

It seems like that would work fine, but how does setting the Z values actually work? Do children inherit their parent's value? Do I make a “Window” widget and only attach Z values to those? Do values get changed as things get dragged around? Is the default value somewhere in the middle of the range, or all the way back?

I could use some help getting on the right track. Thanks!

In the UI part of a game I wrote, I have a stack of windows (where the front-most window is at the top of the stack), and each window has a tree of widgets where child widgets are above their parent widgets. Windows live in screen space; widgets are positioned relative to the top-left of their window, but their sizes are in screen-space units. [[EDIT: Note that a window is just an administrative rectangle for its widgets; only the widgets are visible.]]

Finding a widget from a position on the screen thus means going down the stack until you find a window that covers the position, then going down the widget tree using the position relative to the window.

Clicking a window moves it to the top of the stack; dragging it then just changes its top-left position.
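Sketched roughly (simplified and with made-up names, not my actual code), the lookup and raise-to-top look like:

```cpp
#include <algorithm>
#include <memory>
#include <vector>

// Illustrative version of the window-stack idea; all names are invented.
struct Point { int x = 0, y = 0; };

struct Window {
    int x = 0, y = 0, w = 0, h = 0;          // screen-space rectangle of the window

    bool contains(Point p) const {
        return p.x >= x && p.x < x + w && p.y >= y && p.y < y + h;
    }
    // Widget lookup inside the window would then use (p.x - x, p.y - y).
};

struct WindowStack {
    // Front-most window is stored last, at the top of the stack.
    std::vector<std::unique_ptr<Window>> windows;

    // Walk from the top of the stack down and return the first window covering p.
    Window* findWindowAt(Point p) {
        for (auto it = windows.rbegin(); it != windows.rend(); ++it)
            if ((*it)->contains(p))
                return it->get();
        return nullptr;
    }

    // Clicking a window raises it: move it to the top of the stack.
    void raiseToTop(Window* win) {
        auto it = std::find_if(windows.begin(), windows.end(),
                               [&](const std::unique_ptr<Window>& p) { return p.get() == win; });
        if (it != windows.end())
            std::rotate(it, it + 1, windows.end());   // keeps the order of the others
    }
};

// Dragging then only changes the window's top-left:
//   win->x += dx; win->y += dy;
```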

This is mostly mimicking a regular UI widget set [[EDIT: A simple version of what you'd find for writing a desktop GUI application]]. For many games it's likely overkill, but it depends on the game that you write.


Don't bother with numerical z-values. You already have some sort of data structure that contains the widget tree, right? Maybe an array of top-level widgets, plus an array of children for each of the widgets, or maybe a single global array of all widgets? This data structure should be, by definition, z-sorted. You change the z-order by moving elements up or down this data structure. Iterate front-to-back to find the first widget at a given (x, y) position. Iterate back-to-front when rendering.
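A rough sketch of what I mean (names invented for illustration, using the convention that index 0 is the front-most widget):

```cpp
#include <algorithm>
#include <memory>
#include <vector>

// Sketch of order-as-z-order; all names are illustrative.
struct Widget {
    int x = 0, y = 0, w = 0, h = 0;

    bool contains(int px, int py) const {
        return px >= x && px < x + w && py >= y && py < y + h;
    }
    virtual void draw() const {}
    virtual ~Widget() = default;
};

struct Screen {
    // Index 0 is the front-most widget; the end of the vector is the back.
    std::vector<std::unique_ptr<Widget>> widgets;

    // Hit testing: front to back, first hit wins.
    Widget* widgetAt(int px, int py) {
        for (auto& wgt : widgets)
            if (wgt->contains(px, py))
                return wgt.get();
        return nullptr;
    }

    // Rendering: back to front, so the front-most widget is drawn last (on top).
    void drawAll() const {
        for (auto it = widgets.rbegin(); it != widgets.rend(); ++it)
            (*it)->draw();
    }

    // Changing z-order is just moving an element within the vector.
    void bringToFront(Widget* wgt) {
        auto it = std::find_if(widgets.begin(), widgets.end(),
                               [&](const std::unique_ptr<Widget>& p) { return p.get() == wgt; });
        if (it != widgets.end())
            std::rotate(widgets.begin(), it, it + 1);   // move *it to index 0
    }
};
```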

a light breeze said:

Don't bother with numerical z-values. You already have some sort of data structure that contains the widget tree, right? Maybe an array of top-level widgets, plus an array of children for each of the widgets, or maybe a single global array of all widgets? This data structure should be, by definition, z-sorted. You change the z-order by moving elements up or down this data structure. Iterate front-to-back to find the first widget at a given (x, y) position. Iterate back-to-front when rendering.

That makes a lot more sense. Between your post and Alberth's, I think I'm almost at a complete design. Is it conventional to have events go front→back and rendering go back→front btw? I have it set up the opposite, but if it's convention I can easily switch it.

Alberth said:

This is mostly mimicking a regular UI widget set [[EDIT: A simple version of what you'd find for writing a desktop GUI application]]. For many games it's likely overkill, but it depends on the game that you write.

In this type of window-based system, is there a conventional solution for handling unhover/deselect-style events between windows? For example, suppose I select a widget in one window, then click in another window. The selected widget from window #1 should be deselected, but that window would never even get an event since it'll go straight to window #2. I have an idea of how to solve it (detect when you hover over or select a new window and send a special event to all widgets in the old window), but I'm wondering if it's already a solved problem.

Net_ said:
The selected widget from window #1 should be deselected, but that window would never even get an event since it'll go straight to window #2.

Huh, I have a central stack of windows, and you need that stack to find the second window anyway. So at that spot you can simply store that some other widget is selected. Note you also need that for redirecting keyboard input, e.g. to a widget for entering some text.

Edit: Or rather, the central class stores that another window has a selected widget, and the window knows which widget.
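Sketched roughly (made-up names, not my actual code):

```cpp
#include <vector>

// Illustrative focus/selection bookkeeping in the central window manager.
struct Window {
    int selectedWidget = -1;                 // index of the selected widget, -1 = none

    void clearSelection() { selectedWidget = -1; }
    void select(int widgetIndex) { selectedWidget = widgetIndex; }
};

struct WindowManager {
    std::vector<Window*> stack;              // front-most window last
    Window* selectionOwner = nullptr;        // window that currently holds a selection

    // Called once a click has been resolved to a window; the manager already had
    // to find that window, so it can also clear the old selection right here.
    void onWindowClicked(Window* clicked, int widgetIndex) {
        if (selectionOwner && selectionOwner != clicked)
            selectionOwner->clearSelection(); // deselect in the previous window
        clicked->select(widgetIndex);
        selectionOwner = clicked;             // keyboard input can be routed here too
    }
};
```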

My widgets are lightweight: they have static data for display only, data that doesn't change often, such as colors and non-changing text. Anything else comes from their Window class, which typically grabs that directly from game state data.

This topic is closed to new replies.
