
New twist on the life game idea

Started by sunandshadow November 15, 2001 08:08 AM
10 comments, last by sunandshadow 23 years ago
There's been a lot of discussion in this forum about what a perfect and completely complex life-simulator game or god game would be like - how much realism would be a good thing, whether this kind of game would even be fun to play, etc. Here's a different angle to come at this from: assume life IS a game. A MMOsim, if you will. Assume also, please, that there is no afterlife, because we cannot scientifically observe such a thing, so we can't draw any logical conclusions about it.

Now assume that life has a scoring system (call it an emotional economy), and that you act to maximize your score all the time, usually without noticing. For example, you are nice to your friends out of habit, but this has the indirect benefit that they continue to express approval of you, and when you feel this approval you feel good about yourself - you make an emotional profit. Conversely, if a friend yells at you, you feel bad about yourself and take an emotional loss. Assorted other psychological needs and drives are rewarded the same way.

How do you lose the game? Ignoring natural or accidental death, you can lose by going emotionally bankrupt - getting so depressed that you go catatonic or commit suicide. More forces are at work here: you can't just keep stacking up an emotional profit. Why? Because your emotional balance undergoes a radioactive-type decay over time toward your average mental state. This average differs by individual brain chemistry and personal philosophy, varying from mild displeasure through boredom to contentment.

So what do you think? Silliness, or a useful theoretical model? Can anyone add any details to this construct?

edit: spelling of loose corrected to lose to prevent bishop_pass from getting overexcited

Edited by - sunandshadow on November 16, 2001 12:22:25 AM
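In Python, a minimal sketch of this emotional economy might look like the following. All class and parameter names (and the bankruptcy threshold) are mine, chosen just to illustrate the post's ideas of profit/loss, radioactive-style decay toward a per-character baseline, and losing by emotional bankruptcy:

```python
class EmotionalEconomy:
    """Sketch of the 'emotional economy' idea: a balance that decays
    exponentially toward a per-character baseline (average mental state)."""

    def __init__(self, baseline=0.0, half_life=10.0):
        self.baseline = baseline    # set by brain chemistry and personal philosophy
        self.half_life = half_life  # time for half the deviation from baseline to decay
        self.balance = baseline

    def gain(self, amount):
        """Emotional profit (e.g. a friend's approval); pass a negative amount for a loss."""
        self.balance += amount

    def tick(self, dt):
        """Radioactive-type decay: the deviation from baseline halves every half_life."""
        decay = 0.5 ** (dt / self.half_life)
        self.balance = self.baseline + (self.balance - self.baseline) * decay

    def bankrupt(self, threshold=-100.0):
        """Losing the game: emotional bankruptcy (catatonia or suicide)."""
        return self.balance <= threshold
```

Because of the decay, a single windfall of approval fades back toward the baseline no matter how large it was, which captures the "you can't just keep stacking up an emotional profit" rule.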

I want to help design a "sandpark" MMO. Optional interactive story with quests and deeply characterized NPCs, plus sandbox elements like player-craftable housing and lots of other crafting. If you are starting a design of this type, please PM me. I also love pet-breeding games.

Hmmm, useful theoretical model, I think. Lots of philosophy out there that is very similar to this idea. Slightly more interesting idea (to me, at least): incorporating "emotional profit" into the AI for your next big game. How cool would it be to see bots acting in their own best interest (not just survival-wise, but emotionally)?

FragLegs
I'd love seeing philosophy applied to games. I don't, however, much like the "the purpose of life is to be happy" idea. It doesn't work for a lot of character types - one of them being the hateful evil overlord (and what's a game without an evil overlord?) - and the scoring system is flawed (a Christian saint gets a much greater score than Napoleon, for instance).

IMHO, a philosophy that would work better, and would also make more sense from an algorithmic point of view, is Nietzsche's "the purpose of life is power". An individual's power would be the sum of his raw power (brute force, wealth, working skills, beauty, health, etc.) and the influential power he gains from relations with other people.

A person's emotions would be results of these power levels:
- someone gains power -> he is happy
- someone loses power -> he is sad, or angry
- someone sees no chance to increase his power -> depression, suicide, or other desperate actions

All relationships between people could be described as changes in the influential power relations between individuals:
- x fears y -> y has a certain influential power over x. y's power increases, while x's power decreases
- x and y love each other -> similar to an alliance. Love increases the influential power x gets from y and the influential power y gets from x.

All the actions in the game could be described as their power effect:
- y, z, t and r become friends -> their power increases because they have many friends
- x kills y -> x's power increases by the wealth he steals from y, but also because z, t and r now fear him
- z, t and r put x in prison -> now their power increases because they are no longer afraid.

Actions can also be motivated by their power effect.

Such a system would even explain martyrs: dying reduces a person's raw power to zero, but his influential power can still live on.
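The power model above can be sketched in a few lines of Python. The class and function names are mine, not from the post; the point is just that total power = raw power + influence drawn from relations, and emotions are read off power deltas:

```python
class Person:
    """Sketch of the Nietzsche-style model: total power is raw power
    plus influential power gained from relations with others."""

    def __init__(self, name, raw_power):
        self.name = name
        self.raw_power = raw_power  # brute force, wealth, skills, beauty, health...
        self.influence = {}         # other Person -> power drawn from them

    def total_power(self):
        return self.raw_power + sum(self.influence.values())

def fear(x, y, amount):
    """x fears y -> y gains influential power over x; x's power decreases."""
    y.influence[x] = y.influence.get(x, 0) + amount
    x.raw_power -= amount

def love(x, y, amount):
    """Mutual love works like an alliance: each draws power from the other."""
    x.influence[y] = x.influence.get(y, 0) + amount
    y.influence[x] = y.influence.get(x, 0) + amount

def emotion(power_before, power_after):
    """Emotions derived from the power delta, per the mapping above."""
    if power_after > power_before:
        return "happy"
    if power_after < power_before:
        return "sad or angry"
    return "neutral"
```

Note how this captures the martyr case: setting `raw_power` to zero leaves the entries in other people's `influence` dicts intact, so the martyr's influential power lives on.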
Or you could just go back to the original idea of emotional profit and base the profit/loss on the person's character.

ie. evil overlord get profit from seeing people suffer, having underlings that kow-tow to all his wishes, etc.
ie. super hero nice guy gets profit from helping people, having friends, etc.
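One simple way to realize this (a sketch under my own assumptions - the event names and weights are made up for illustration) is a per-archetype weight table, so the same events score differently for different characters:

```python
# Hypothetical profiles: each archetype weights the same events differently.
PROFILES = {
    "evil_overlord": {"people_suffer": +5, "underlings_obey": +3,
                      "helped_someone": -2, "made_friend": 0},
    "super_hero":    {"people_suffer": -5, "underlings_obey": 0,
                      "helped_someone": +4, "made_friend": +3},
}

def emotional_profit(profile_name, events):
    """Total emotional profit a character of this archetype gets from a list of events."""
    weights = PROFILES[profile_name]
    return sum(weights.get(event, 0) for event in events)
```

The same day of events then yields a profit for the overlord and a loss for the hero, which is exactly the flexibility Captain Jester is describing.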

---
Make it work.
Make it right.
Make it fast.
"None of us learn in a vacuum; we all stand on the shoulders of giants such as Wirth and Knuth and thousands of others. Lend your shoulders to building the future!" - Michael Abrash
[JavaGaming.org] [The Java Tutorial] [Slick] [LWJGL] [LWJGL Tutorials for NeHe] [LWJGL Wiki] [jMonkey Engine]
I agree with Captain Jester. An emotional-profit-based system would allow a huge amount of flexibility in what drives individual characters. Everyone could have a happiness metric. This way, the emotional-profit idea is actually a superset of Diodor's Nietzsche-based idea: any "supermen" in the game could have a happiness metric based on power. In fact, with different characters working toward different, undefined goals, you could wind up with emergent collaboration (two characters find that working together brings both of them closer to their two different goals). This is kind of neat.
sunandshadow - very cool idea... but I have to question the strength of such a system if the game is a MMOsim. I could see a single-player version of this working very well, but an MMO version would be hard to implement (at least if the game were built on the same character-interaction-based ideas as MMORPGs).

Taking two players who might not really like each other and forcing them to be nice and supportive to each other, because it is of mutual benefit to the characters they play... well, it doesn't seem like it would be an acceptable expectation of the online gaming community. Well, at least not of an online community that is more or less forced to endlessly fight monsters and never questions such actions...

But I'd love to hear more.
A more accurate scoring system would be short-term avoidance of anxiety. Most human behavior can be explained this way: people work at all times to stay in a local minimum of anxiety, which most often leads them away from any global minimum of anxiety.

People involved in cults show this behaviour best: it is easier to live a lie than to face the fact that you have been fooled. Or rather, at any given moment it costs less anxiety to keep to the lie than to break it, but the total anxiety over time is much greater when you live the lie.

Real humans very often act with their own short-term interests in the front seat. We are shortsighted and scared, and if the AI is meant to model realistic characters, then they too should behave the way we do. (Lovingly dysfunctional, as I like to say.)
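The local-vs-global-minimum point can be made concrete with a toy example (the anxiety numbers are invented for illustration): a greedy agent that only compares immediate anxiety costs will pick the path with the worse total cost.

```python
def greedy_choice(options):
    """options: {action: [anxiety cost now, cost later, cost later, ...]}.
    The greedy (shortsighted) agent looks only at the first, immediate cost."""
    return min(options, key=lambda action: options[action][0])

def total_cost(options, action):
    """Total anxiety over the whole path - what a far-sighted agent would compare."""
    return sum(options[action])

# The cult example: a small cost every day forever, versus one painful
# confrontation followed by peace.
cult_member = {
    "keep_living_the_lie": [1, 1, 1, 1, 1],
    "face_the_truth":      [4, 0, 0, 0, 0],
}
```

The greedy agent keeps living the lie (immediate cost 1 beats 4) even though facing the truth is cheaper in total - the local minimum of anxiety leads it away from the global one.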
I agree - that is a really good idea. With emotional profit, I would make it so that you can have not only too little but too much. When a character becomes too happy, they could start to lose mental stability. It would be even more realistic if, as a chain reaction, too much ego makes the character plummet into self-loathing and mutilation. Quite good indeed.

Grunt as much as possible today
-The zombie of the underworld, Mr. Dude
In capitalism man exploits man; in socialism it's exactly the opposite
-Ben Tucker
MSW, we were discussing this as a potential for AI, so it doesn't really apply to MMORPG games (except for whatever AI is in the game), even though the idea spawned from relating MMOs to real life. Actually, this system is implemented by default in any MMORPG for the human players. That is, I think most people play video games with some sort of happiness goal in mind. Nobody would be forced to work together - unless being forced to do something floats your happiness-metric boat.

MrDude, I like the idea of too much of something causing adverse reactions in the same way that too little does (very Zen). Perhaps personal image (i.e. ego) should be one of the variables that go into a person's metric, along with health, entertainment, etc. That way, you might get really interesting situations where a person has to steal to eat: the food in their belly outweighs the guilt of stealing in their metric, so they steal, but with some internal conflict.
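The steal-to-eat conflict can be sketched as a weighted metric over several needs. The weights and effect numbers here are my own invented examples; the point is that the action wins on net score while the negative guilt term still registers as conflict:

```python
def score(action_effects, weights):
    """Net happiness change of an action: weighted sum over affected needs."""
    return sum(weights[need] * delta for need, delta in action_effects.items())

# Hypothetical character: very hungry, so the hunger need is weighted heavily.
weights = {"hunger": 3.0, "guilt": 1.0, "ego": 0.5}

steal_food = {"hunger": +2, "guilt": -3, "ego": -1}  # eat, but feel bad about it
go_hungry  = {"hunger": -2, "guilt": 0,  "ego": 0}   # stay honest, stay hungry

best = max([steal_food, go_hungry], key=lambda action: score(action, weights))
```

Stealing scores 2.5 against -6.0 for going hungry, so the character steals - but the -3 guilt term is still there, and a designer could surface it as visible internal conflict (hesitation, remorse dialogue, etc.).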

FragLegs
quote: Original post by Diodor
... and the scoring system is flawed (a Christian saint gets a much greater score than Napoleon for instance).



I don't know - let's look at some examples:
Saint Stephen - well, he was the first martyr. Not happy.
Saint Augustine learned Hebrew to keep himself from masturbating. Not happy.
That fellow who was sainted for taking the place of a Jewish fellow prisoner in a Nazi concentration camp. Not happy.

And on the other side:
Napoleon conquered Europe. Usually happy.
Napoleon was "imprisoned" in a luxury estate. Not supreme commander of Europe, but still usually happy.
Napoleon had chronic gastrointestinal issues. Not happy, but without his Zantac, not much he could do.

__KB

This topic is closed to new replies.
