
AI and Needs

Started by November 13, 2008 08:24 AM
3 comments, last by abachler 16 years ago
Hello, I'm working toward implementing a dumbed-down version of Maslow's Hierarchy of Needs as the driving force for NPCs in my game. Essentially, NPCs will have a series of Needs, and what they do throughout the day will increase or decrease these Needs. I am not attempting to simulate reality, but rather to present believability. The following is a set of definitions and limitations for the system:

- Every Need ranges from 0.0 to 1.0; the lower value indicates a completely met Need, and the higher value indicates a completely unmet Need (one requiring immediate attention). Need values may not exceed these boundaries.
- Needs may be modified up or down, at varying rates.
- Needs are affected by the use of items (e.g. eating food, consuming poison), performing certain actions (doing a job, donating food to the poor), encountering certain situations (walking by the welcoming scent of a bakery, observing a crime in progress), or time progression (physical healing, or becoming hungry over time).
- Needs are hierarchical; high-importance needs supersede low-importance needs. That is, if an NPC is engaged in a lower-importance need and a higher-importance need suddenly materializes (e.g. a worker doing his job becomes hungry), the NPC will pause to deal with the higher-importance need (the worker will take a break to have a snack) before returning to work.
- Every use of an item, performance of an action, encounter of a situation, or time progression (collectively, "Affectors") will have a "Need Template" associated with it: a list of Needs and how much each one changes for the affectee. Affectors do not necessarily affect all Needs at once (and will likely affect only a small number).
- While developing this idea, I came up with the concept that whether an NPC fits into society (e.g. how peaceful they are) is determined by a factor outside of the items described above.
Essentially, an NPC can also have a "how much do I really care about each Need" set of values, which influence how likely they are to perform certain actions that *could* negatively impact other Needs. For example, let's say NPC #1 reaches Food >= 0.9, which causes him to seek out food. When he finds it, he will determine whether stealing the food (if his Morality=0.0) or paying for the food (Morality=1.0, SecurityOfWealth=1.0) will be the course of action he takes. I'm not sure how far I want to pursue this, since it complicates an already complicated system tenfold, but it is food (ha!) for thought. Your ideas?
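A minimal sketch of the scheme described above, in Python. All names (`NPC`, `apply_affector`, the 0.9 "act now" threshold) are illustrative assumptions, not part of any engine:

```python
def clamp(x, lo=0.0, hi=1.0):
    """Keep a Need value inside its [0.0, 1.0] boundaries."""
    return max(lo, min(hi, x))

class NPC:
    def __init__(self, needs, care):
        # needs: name -> current value in [0, 1]; 1.0 = completely unmet
        self.needs = dict(needs)
        # care: name -> "how much do I really care about this Need"
        self.care = dict(care)

    def apply_affector(self, template):
        # template is a "Need Template": name -> delta. Affectors touch
        # only the Needs they list, and results are clamped to [0, 1].
        for name, delta in template.items():
            if name in self.needs:
                self.needs[name] = clamp(self.needs[name] + delta)

    def most_urgent(self, priority_order):
        # priority_order: need names, most important first, so a
        # higher-importance unmet need supersedes a lower-importance one.
        for name in priority_order:
            if self.needs[name] >= 0.9:  # illustrative "act now" threshold
                return name
        return None

npc = NPC(needs={"food": 0.5, "rest": 0.2}, care={"food": 1.0, "rest": 0.5})
npc.apply_affector({"food": 0.45})        # time passes, hunger grows
print(npc.most_urgent(["food", "rest"]))  # prints: food
```

The "care" values are stored but unused here; they would come into play when choosing *how* to satisfy a need (steal vs. pay), per the Morality example above.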
Sounds pretty standard. The devil's in the details, as usual.

An NPC, halfway to work, will suddenly reach the threshold of hunger where he turns around and goes home for a snack, not having planned to be hungry for breakfast. Another NPC, walking by a bakery, decides to go in, realizes that he has neither the money to buy food nor the immorality to steal it, walks out, and walks back in to repeat the process perpetually (until he starves). A third NPC, less moral than the second, steals a loaf of bread, observes himself doing so, and proceeds to chase himself around.

All of these potential problems have simple solutions, but they serve as an illustration of the fundamental issues with the "nuanced AI through lots of multipliers" approach. These problems may occur only under extremely specific conditions, are often difficult to reproduce, and are difficult to fix without introducing new problems. If you want believability above all else, start by defining exactly what situations NPCs will be allowed to encounter, then give them only the intelligence to deal with them and not an IQ point more.

"Every Need ranges from 0.0 to 1.0;"

You might consider having it range from -1.0 to 1.0, so that a need can at times be met in excess and further actions for it can be delayed (0 being the point where the need could be acted upon).


You might also have a priority curve for each need which spells out how important the need is at each value (priority escalating as the need gets greater).
Objects can be customized by adjusting those curves so that they act on different priorities at different points (a pessimist will have higher priorities lower on the scale, while an optimist/risk-taker would delay raising the priority until the need factor is nearing 1.0).

You can use histograms (arrays) instead of a continuous curve to simplify the calculations.
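A sketch of that histogram idea: each need's priority curve is sampled into an array, and priority is looked up by nearest sample. The specific curves and sample counts here are made up for illustration:

```python
def make_curve(samples):
    """Build a priority function from evenly spaced samples over [0.0, 1.0]."""
    def priority(need_value):
        # nearest-sample lookup; swap in linear interpolation if you
        # want smoother escalation between samples
        idx = min(int(need_value * (len(samples) - 1) + 0.5), len(samples) - 1)
        return samples[idx]
    return priority

# A pessimist escalates priority early; an optimist/risk-taker delays
# until the need factor is nearing 1.0.
pessimist = make_curve([0.0, 0.3, 0.6, 0.8, 1.0])
optimist  = make_curve([0.0, 0.0, 0.1, 0.3, 1.0])

print(pessimist(0.5), optimist(0.5))  # prints: 0.6 0.1
```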
DarkHorizon,

I'm working on a research project doing almost exactly what you've listed for a multi-agent system (in fact, Maslow's Hierarchy is mentioned a few times in my proposal).

I've handled the "how much do I really care about each Need" by utilizing personality temperaments (Keirsey, Myers-Briggs, etc.), which are a vector of four integers from -1 to 1. Basically, for each need I assign a "base" personality that is most attracted to the need. This base is then compared to the personality of each agent to decide how much the need will grow over time for that agent.

This allows a bit more diversity in how agents respond to growing needs and solves the "starving artist" problem often cited with Maslow's theory.
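A guessed reading of the temperament mechanism above: each need has a "base" personality vector, and an agent's similarity to that base scales how fast the need grows. The four-axis vectors and the similarity/scaling rules here are assumptions for illustration:

```python
def similarity(a, b):
    """Agreement between two 4-axis temperament vectors, each axis in
    [-1, 1]; returns a value in [0, 1], where 1 means identical."""
    return sum(1.0 - abs(x - y) / 2.0 for x, y in zip(a, b)) / len(a)

def growth_rate(base_rate, need_personality, agent_personality):
    # agents whose temperament matches the need's base feel it grow faster
    return base_rate * similarity(need_personality, agent_personality)

esteem_base = (1, -1, 1, 1)    # hypothetical base personality for "esteem"
artist      = (1, -1, 1, 0)    # close match: esteem grows quickly
merchant    = (-1, 1, -1, -1)  # opposite: esteem barely registers

print(growth_rate(0.1, esteem_base, artist))
print(growth_rate(0.1, esteem_base, merchant))
```

This is how the "starving artist" case falls out: an agent whose temperament is far from a need's base simply never feels that need grow with much urgency.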

As was pointed out to me, you may want to look into some articles that talk about how The Sims works, because it's basically a simplified Maslow's Hierarchy.

Just curious, but is that Cambridge England by any chance? I'm in the US, but I do quite a bit of work for various companies on that side of the pond.

Nice to see someone else working on this idea though, makes me feel better about my own work heading in the right direction!

Hope your project turns out well,

--Rhalin
Needs need to have thresholds for each possible solution, so that stealing a loaf of bread, while abhorrent to an NPC, will still trigger if it is the only option available and the need goes unfulfilled long enough.
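One way to sketch that per-solution threshold idea, assuming solutions are listed most-preferred first and filtered by availability (all names here are illustrative):

```python
def choose_solution(need_value, solutions, available):
    """solutions: (name, threshold) pairs, most preferred first.
    An option fires only once the need has crossed its threshold
    AND the option is actually available to this NPC."""
    for name, threshold in solutions:
        if need_value >= threshold and name in available:
            return name
    return None

# Buying is acceptable early; stealing only when near starvation.
food_solutions = [("buy bread", 0.6), ("steal bread", 0.95)]

# With money, buying triggers long before stealing is considered...
print(choose_solution(0.7, food_solutions, {"buy bread", "steal bread"}))   # buy bread
# ...a penniless NPC waits for now...
print(choose_solution(0.7, food_solutions, {"steal bread"}))                # None
# ...but will eventually steal rather than starve.
print(choose_solution(0.97, food_solutions, {"steal bread"}))               # steal bread
```

This also sidesteps the bakery loop in the earlier reply: the NPC without money or immorality simply has no available solution yet, rather than re-entering the shop forever.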

