
Theory for advanced AI in games

Started by August 11, 2011 06:06 PM
11 comments, last by IADaveMark 13 years, 3 months ago
Most developers deal strictly with stats and don't expand much past that. Stats are vital, of course, but so is the layer on top of them: the personality. Psychology can help define how an AI interacts with the world around it. The personality should be dynamic, shifting in response to the player's actions, the environment, or other AI.

Personality should derive from whatever the developer feels are the necessary decision-making traits for their AI. If, for example, we look at Oblivion, the AI was developed to walk around town and look busy. However, it was far too scripted, and the AI was about as basic as AI gets. Processing power and the like play a role in these calculations, but for this example, let's ignore that. To spice it up, you can assign a few personality roles to each AI. For simplicity, let's use three: Friendly, Angry, Neutral. Then, below that layer, add stats. The number of stats, like the number of personality traits, is limitless; the more you have, the more complex the AI becomes.
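To make the layering concrete, here's a minimal sketch of what those two layers might look like in code (Python purely for illustration; the trait and stat names mirror the worked example further down, and every number is a placeholder):

[code]
from dataclasses import dataclass, field

@dataclass
class NPC:
    # Trait layer: how strongly each personality role is expressed (weights sum to 1.0).
    traits: dict = field(default_factory=lambda: {
        "friendly": 0.80,
        "angry": 0.05,
        "neutral": 0.15,
    })
    # Stat layer: raw counters recorded over the last hour of play.
    stats: dict = field(default_factory=lambda: {
        "drinks": 0,
        "food": 5,
        "interactions": 3,
        "failed_interactions": 8,
        "good_outcomes": 0,
        "bad_outcomes": 8,
    })
    # Decision-making ("interaction") stats derived from the two layers above.
    happiness: float = 0.50
    thirst: float = 0.20
[/code]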

Stats for each AI could include things like [player/environmental interaction stats]:
Bad outcome in the last hour [negatively affects happiness]
Good outcome in the last hour [positively affects happiness]
Interactions [if Friendly, positively affects happiness; if Angry, negatively affects happiness]
Distance traveled [boredom, fatigue]
etc.

Stats would be recorded for whatever the designer feels is important. Each stat then feeds into a preconceived list of "decision-making" or "interaction" stats. When an AI interacts with a player or with something in the environment, these interaction stats are used in the computations for success or failure.
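As a rough illustration of that roll-up, here's one way the hourly counters might feed a decision-making stat, using the NPC sketch above (the per-counter weights are made up, not from any real game):

[code]
def update_happiness(npc):
    """Roll the raw hourly counters up into the 'happiness' interaction stat.

    The per-counter weights here are placeholders; a designer would tune
    them (and add or remove counters) per game.
    """
    s = npc.stats
    delta = (0.05 * s["good_outcomes"]
             - 0.05 * s["bad_outcomes"]
             + 0.02 * s["interactions"]
             - 0.02 * s["failed_interactions"])
    # Keep happiness in the 0..1 range the later decision checks expect.
    npc.happiness = max(0.0, min(1.0, npc.happiness + delta))
[/code]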

Back to the Oblivion example: the designer initially assigns a personality to an NPC. For this example, they choose Friendly. The Friendly trait's internal code interacts with the stats differently than the other traits would. If the interaction count for the day is very high, a Friendly NPC will be happier. The code would look something like: happiness = FriendlyValue * Interactions (extremely simple, but you get the idea).

Now you can hardcode quests/secrets into the AI for the player to interact with: if AIhappiness > 50, then respond with "Here, take this free item!"
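Putting the trait weighting and the quest threshold together, a toy version (again using the NPC sketch above; the 0.1 scale factor, the 50% cutoff, and the dialogue strings are all placeholders) might be:

[code]
def trait_weighted_happiness_bonus(npc):
    """Toy version of 'FriendlyValue * Interactions = happiness'.

    A mostly-Friendly NPC gains happiness from interactions; a mostly-Angry
    one loses it. The 0.1 scale factor is arbitrary.
    """
    interactions = npc.stats["interactions"]
    return 0.1 * interactions * (npc.traits["friendly"] - npc.traits["angry"])

def quest_response(npc):
    """Hardcoded secret: generous dialogue only when happiness clears the cutoff."""
    if npc.happiness > 0.5:
        return "Here, take this free item!"
    return "Not now."
[/code]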

While the outcome of this system is similar to what is typically done, the expansiveness is unparalleled. You can make an extremely complex AI by:

1) Assigning a personality trait or traits
2) Assigning important stats to keep track of
3) Developing "interaction" formulas that relate traits and stats to whatever the AI interacts with

With the basics laid out, I'll try to do a full-blown example. I'll start by listing the AI's traits and the requirements of the object the AI will interact with.

AI:
-Traits
Friendly - 80%
Angry - 5%
Neutral - 15%


Stats (based on the last hour):
-Drinks: 0
-Food: 5
-Interactions: 3
-Failed interactions: 8
-Money: 5
-Good outcomes: 0
-Bad outcomes: 8
-Neutral outcomes: 3

Decision makers (out of 100%):
-Thirst - 20%
-Curiosity - 60%
-Boredom - 80%
-Happiness - 50%

Thirst is derived from the algorithm, on update: thirst = oldThirst + (drinks * 0.02) - 0.05
So it will always decrease by 5% per update if the NPC has no drinks. Traits have no bearing on thirst, so they're not factored in.
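A runnable version of that update rule, plus the 20% trigger mentioned next, might look like this (same NPC sketch as above; the 2% and 5% constants are the ones just given):

[code]
def update_thirst(npc):
    """Per-update rule: thirst = old_thirst + (drinks * 0.02) - 0.05.

    With no drinks in the last hour, thirst decays by 5% each update; each
    drink offsets that by 2%. Traits are deliberately not factored in.
    """
    npc.thirst = max(0.0, min(1.0, npc.thirst + (npc.stats["drinks"] * 0.02) - 0.05))

def wants_drink(npc):
    """The NPC is 'buzzed' to go find something to drink once thirst hits 20%."""
    return npc.thirst <= 0.20
[/code]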

The AI gets buzzed at 20% thirst to go find something to drink. So he jumps into the pathing routine and maps out his way to the closest vending machine. He walks toward it as the player watches... "I wonder where he's going?"

The AI arrives. Now he interacts with the vending machine code:

Vending machine code:
  • Refresh cans once a day
  • Current cans:
    • Sprite: 0
    • Coke: 3
    • Lipton: 1
  • On proximity: spark the AI's "am I thirsty?" routine.
    • Essentially, when an AI passes within 'x' proximity of this object, it alerts the AI to check whether it's thirsty. This check can be complicated or simple, but it will be based on the AI's derived decision-making stats. Obviously we'll want to check "thirst" (see the sketch after this list):
      • decisionCheckValue = 100 - (0.2 [thirst value] * 100)
      • randomizedDecision = decisionCheckValue * rnd(±5%) [this can be based on anything, including other stats, like "cheap", which would relate to money]
      • if randomizedDecision > 70, buy a drink
      • Our AI would have a value of 80, plus or minus 5%, so he would buy a drink.
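A loose sketch of that proximity check, following the numbers in the list above (the 70 cutoff and the ±5% jitter); function and variable names are invented for illustration:

[code]
import random

def thirsty_enough_to_buy(npc):
    """Vending machine proximity check: 'am I thirsty?'.

    decisionCheckValue = 100 - (thirst * 100); a +/-5% jitter is applied and
    the NPC buys a drink if the result exceeds 70, as in the list above.
    """
    decision_check_value = 100 - (npc.thirst * 100)   # 100 - (0.20 * 100) = 80
    jitter = random.uniform(0.95, 1.05)               # the +/-5% randomization
    return decision_check_value * jitter > 70         # True -> go buy a drink
[/code]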

However, our AI is already headed there for a purpose, and thus would skip the close-proximity reaction (which would occur for almost all objects in the game that the AI can interact with).

The AI arrives and already wants to buy a drink. Maybe his AI designates that his favorite drink is Sprite. Whenever he drinks Sprite, he gets a good outcome; whenever he drinks anything else, it's a bad outcome. So he tries to buy Sprite, but it's empty: +1 bad outcome. Then he buys Coke: another +1 bad outcome.

The bad outcomes readjust his happiness, and it drops. Now a little kid comes by who is scripted to ask players/AIs for drinks. He hits the radius of the AI, is buzzed that the AI has a drink, and starts the interaction.

Just like the vending machine, there is a scripted "check" to see if the AI will give it to him. To make it as simple as possible, it just checks his happiness: if Friendly, then it checks whether happiness is > 50%. If yes, give the soda. Well, our AI's happiness was just adjusted downward due to 2 bad outcomes, so it falls below 50% and he does not give up the soda. The boy thus gets a bad outcome.

We can even change personality traits on the fly: if happiness stays below 20% for 2 hours, change the trait to Angry.
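A minimal sketch of those last two rules: the give-a-soda check and the delayed trait switch (the 50%, 20%, and 2-hour figures come from the text; the replacement trait weights are arbitrary):

[code]
def will_give_soda(npc):
    """The kid's scripted check: a Friendly NPC hands over the soda only if happy enough."""
    is_friendly = npc.traits["friendly"] > max(npc.traits["angry"], npc.traits["neutral"])
    return is_friendly and npc.happiness > 0.5

def maybe_switch_trait(npc, hours_below_20_percent):
    """If happiness has stayed under 20% for 2 hours, flip the dominant trait to Angry."""
    if npc.happiness < 0.2 and hours_below_20_percent >= 2:
        npc.traits.update({"friendly": 0.10, "angry": 0.80, "neutral": 0.10})
[/code]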

This is a hard concept to explain in a small forum window. Also, since I haven't developed an actual game yet, the examples are very loose. However, I'm sure that if you reread it a few times you can see where I am going with it.

Essentially, in short: the idea is to develop AI with different layers that affect each other.

Would love some discussion on the subject! Let me know what you think.
Between this thread and your other thread about simulating Physics at the atomic level as a way to achieve AI, I get the impression that you should probably take off your visionary hat and put on an engineer hat if you want to get anywhere.

[quote]
Between this thread and your other thread about simulating Physics at the atomic level as a way to achieve AI, I get the impression that you should probably take off your visionary hat and put on an engineer hat if you want to get anywhere.
[/quote]


Brilliant post... you've really shown me up. Nah, just kidding, your post was meaningless.

[quote]
Brilliant post... you've really shown me up. Nah, just kidding, your post was meaningless.
[/quote]


Whatever. Once you build something cool, people will be interested in the theory behind it. In the meantime, [yawn]...

[quote]
[quote name='fr0st2k' timestamp='1313093519' post='4847875']
Brilliant post... you've really shown me up. Nah, just kidding, your post was meaningless.
[/quote]
Whatever. Once you build something cool, people will be interested in the theory behind it. In the meantime, [yawn]...
[/quote]

Right... because everything amazing in this world is built without any forethought. The house you're living in? Well, someone just decided to wake up one day and throw some pieces of wood and nails together! The computer you're using? Just a random assembly of electronic parts.

Are you kidding me? What have you done with your life that you're so proud of? Maybe you should try using your brain and giving some of your programs some thought so that people might actually feel the urge to use them.
Write a program. Make a game. Make an AI. Make a physics simulation. Just simple stuff (I've done them all).

After that, come back. But I guess you won't, because you'll realize how and where your theories are flawed. Basically, you have fairly straightforward theories that require enormous computing power, which we probably won't have for many years, if not decades or centuries. It's quite hard to explain, really. That's why it is so hard (impossible) to persuade the dudes who come up with the "compressing any file to 2 bytes" """algorithm""" ideas.
Keep it civil and refrain from the personal sniping, please.

Wielder of the Sacred Wands
[Work - ArenaNet] [Epoch Language] [Scribblings]

You are describing AI techniques that were used 10 years ago in The Sims, Black & White, and numerous other games since that period. Additionally, these are simple techniques that are the subject of my book and many of the lectures I've done. Suffice to say, it is neither visionary nor even "new" or "advanced". Of course, if you had done any study on the topic at all, you would have known that.

Go look up:
  • smart terrain (as in The Sims)
  • reinforcement learning (esp. as it relates to Black & White)
  • utility-based decision modeling
  • weight-based random selection
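For anyone unfamiliar with the last item, weight-based random selection is easy to sketch; this is a generic illustration, not code from any of the games or books mentioned:

[code]
import random

def weighted_random_choice(actions):
    """Pick an action with probability proportional to its weight.

    'actions' maps action names to non-negative scores, e.g.
    {"buy_drink": 80, "wander": 15, "ignore": 5}.
    """
    total = sum(actions.values())
    roll = random.uniform(0.0, total)
    cumulative = 0.0
    for action, weight in actions.items():
        cumulative += weight
        if roll <= cumulative:
            return action
    return action  # guard against floating-point edge cases
[/code]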


Dave Mark - President and Lead Designer of Intrinsic Algorithm LLC
Professional consultant on game AI, mathematical modeling, simulation modeling
Co-founder and 10 year advisor of the GDC AI Summit
Author of the book, Behavioral Mathematics for Game AI
Blogs I write:
IA News - What's happening at IA | IA on AI - AI news and notes | Post-Play'em - Observations on AI of games I play

"Reducing the world to mathematical equations!"


[quote]
After that, come back. But I guess you won't, because you'll realize how and where your theories are flawed. Basically, you have fairly straightforward theories that require enormous computing power, which we probably won't have for many years, if not decades or centuries. It's quite hard to explain, really.
[/quote]

Huh? See my post above. As odd as it was for the OP to suggest that his ideas were "advanced", your reply is just as odd for suggesting that something already done long since is "flawed" and computationally huge.


Dave Mark - President and Lead Designer of Intrinsic Algorithm LLC
Professional consultant on game AI, mathematical modeling, simulation modeling
Co-founder and 10 year advisor of the GDC AI Summit
Author of the book, Behavioral Mathematics for Game AI
Blogs I write:
IA News - What's happening at IA | IA on AI - AI news and notes | Post-Play'em - Observations on AI of games I play

"Reducing the world to mathematical equations!"

I believe Szecs was referring to the OP's other thread, specifically.

throw table_exception("(╯°□°)╯︵ ┻━┻");

This topic is closed to new replies.
