Hello Designer Community,
we haven't seen each other for a while, but now that I'm doing some more advanced game design again, I thought I'd visit you to get some feedback.
As I'm in the design phase now, I had some ideas/thoughts about the HUD and input system of the base game engine Spark is carrying with it. When playing games like multiplayer/online/MMO RPGs lately, I always see screens full of overloaded HUDs, icon bars in every corner (especially in MMOs), and more text than my last tax return. The question I asked myself was: "Why not make a game's HUD/input system as simple as possible?" Sure, it sometimes depends on the kind of game; in an MMO RPG you have bars upon bars of player-triggered skills/abilities. But is this really necessary, or did it just accumulate over years of game design?
While also aiming to be VR-ready in the future and platform-independent at any time, I designed a simple "touch"-based input system and HUD with the philosophy "everything is a touch, and anything should only be a touch away" in mind. VR differs somewhat from "traditional" control input, that's true, but I think this approach more or less equalizes the two: while you are free to touch anything in VR, in traditional control schemes you are bound to what you focus on.
Anything in the game world may have a touch point that either triggers the one action it is capable of, or shows a menu for choosing whichever action can be done. In traditional input, you do this with the camera focus and an action button (mouse, or something similar on gamepad). The number keys or shoulder buttons switch between hands (for example, 1, 2 or L, R on gamepad switch to the left or right hand, while 3, or both shoulder buttons pressed at the same time, activates both hands) to decide which hand and/or item to use.
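To make the touch-point idea concrete, here is a minimal sketch in Python. The names (`TouchPoint`, `touch`, the door/chest objects) are my own illustrations, not anything from the Spark engine: an object with exactly one capability fires it directly, while an object with several returns a menu of choices.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Union

# An action is just something callable that reports what happened.
Action = Callable[[], str]

@dataclass
class TouchPoint:
    """A touchable spot on a world object, holding its capable actions."""
    actions: Dict[str, Action] = field(default_factory=dict)

    def touch(self) -> Union[str, List[str]]:
        # Single capability: trigger it immediately.
        if len(self.actions) == 1:
            (action,) = self.actions.values()
            return action()
        # Multiple capabilities: hand back a menu for the player to choose from.
        return list(self.actions)

# A door that can only be opened, vs. a chest with several options.
door = TouchPoint({"open": lambda: "door opened"})
chest = TouchPoint({"open": lambda: "chest opened",
                    "pick lock": lambda: "lock picked"})

print(door.touch())   # fires the one action directly
print(chest.touch())  # returns the menu of choices
```

The same dispatch would back both VR (a literal touch) and traditional input (camera focus plus the action button), which is the equalizing effect described above.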
The HUD is separated into the static interface, which just shows some player information like HP and buffs/debuffs for as long as necessary (they disappear once HP is full or the buffs/debuffs expire), and the fully touch-friendly interface, which is opened with a zipper gesture, or with the left mouse button/inventory key on gamepad in the traditional input system (TIS). A full-handed swipe gesture, or the same inputs in the TIS, closes a menu. Anything else is also touch-controlled as above.
A claw gesture, or the quick-access key in the TIS, opens a quick access menu.
In summary, I have:

Mouse/Keyboard
- Mouse Right (Use)
- Mouse Left (Menu)
- W, A, S, D (Movement)
- 1, 2, 3 (Switching Hands)
- Crouch/Sneak Key
- Jump Key
- Quick Access Key

Gamepad/Switch
- Right Stick (Movement)
- Left Stick (Camera)
- Right and/or Left Shoulder Button (Switching Hands)
- Use Button
- Menu Button
- Crouch/Sneak Button
- Jump Button
- Quick Access Button
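Both lists above could feed one shared action table, so the game logic never sees raw keys or buttons, only abstract actions. Here is a hedged sketch of that idea; the binding names and the crouch/jump/quick-access keys are illustrative guesses, not actual Spark identifiers:

```python
# One abstract action vocabulary backing both control schemes.
BINDINGS = {
    "keyboard_mouse": {
        "mouse_right": "use",
        "mouse_left": "menu",
        "wasd": "move",
        "mouse_move": "camera",
        "1": "hand_left", "2": "hand_right", "3": "hand_both",
        "ctrl": "crouch",        # placeholder key choices; the post
        "space": "jump",         # doesn't pin these down
        "q": "quick_access",
    },
    "gamepad": {
        "use_button": "use",
        "menu_button": "menu",
        "right_stick": "move",
        "left_stick": "camera",
        "shoulder_l": "hand_left",
        "shoulder_r": "hand_right",
        "shoulder_l+shoulder_r": "hand_both",  # both pressed together
        "crouch_button": "crouch",
        "jump_button": "jump",
        "quick_button": "quick_access",
    },
}

def resolve(scheme: str, raw_input: str) -> str:
    """Translate a scheme's raw input into the shared abstract action."""
    return BINDINGS[scheme].get(raw_input, "none")

print(resolve("keyboard_mouse", "mouse_right"))     # -> use
print(resolve("gamepad", "shoulder_l+shoulder_r"))  # -> hand_both
```

A VR scheme would just be a third entry in the same table, mapping gestures (zipper, claw, full-handed swipe) onto the same actions.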
What I'm left with now: if player-triggered skills/abilities/spells/whatever fits this category exist, how do I handle them in VR, and how do I minimize input complexity (like having 3 switchable bars with shortcut keys) in the TIS?
Am I missing some functions/features (like blocking?), and where/how would you place them in VR/TIS?
Thanks in advance!