My take - love me or hate me ...
(Dons flame-retardant gear before continuing...)
1 : I'm not criticising anybody's methods ... if they work, go for it ... but a lot of the posts I see are about how we can make an AI "think" like a person. I disagree with this approach - game AI should really be more along the lines of mimicking a person.
I mean, if you want an AI that plays like a human, then model a ruleset which allows your opponent to play like you do, and not necessarily interpret information the way you do.
Make sense?
2 : This is what I recommend newbies try as a way of structuring AI : http://www.skoardy.demon.co.uk/rlnews/dev00016.html. In my opinion, OOP can be a really simple, but really effective, NN.
3 : Using multithreading to maximise AI is not a new idea, but it's one of the better ones. What I want to know is whether three threads (rendering, interface & AI) are enough, or whether I should go with fewer or more. Three work very well, but if I can do better, why shouldn't I?
4 : Just remembered this - I read about reputation in one of the longer threads, and I support the idea. For example, just have a value where 0 is pure evil, 255 is pure good and 127 is neutral (plus the grey area in between). Then use it to determine who will talk to you, attack you, run away from you, and where you can and cannot enter (temples and the like). One stat can replace quite a few. You could also use it to determine what kind of quests you get.
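As a sketch of how one 0-255 alignment stat could drive several behaviours at once - the function name, thresholds and reaction labels below are all made up for illustration:

```cpp
#include <cassert>
#include <string>

// Hypothetical example of driving several NPC behaviours off one 0-255
// alignment stat (0 = pure evil, 127 = neutral, 255 = pure good).
// NPCs react to how far the player's alignment sits from their own.
std::string reactionToPlayer(int npcAlignment, int playerAlignment)
{
    int gap = npcAlignment - playerAlignment;
    if (gap < 0) gap = -gap;

    if (gap < 32)  return "friendly";  // kindred spirits: will talk, offer quests
    if (gap < 96)  return "neutral";   // wary, but won't attack
    return "hostile";                  // attacks, flees, or bars entry (temples etc.)
}
```

The same single stat can feed quest selection, shop prices, guard behaviour and so on, which is the "one stat replaces quite a few" point.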
Merrick
Edited by - morfe on 9/8/00 8:50:47 AM
"NPCs will be inherited from the basic Entity class. They will be fully independent, and carry out their own lives oblivious to the world around them ... that is, until you set them on fire ..." -- Merrick
quote: Original post by morfe
1 : I'm not criticising anybody's methods ... if they work, go for it ... but a lot of the posts I see are how we can make an AI "think" like a person. I disagree with this approach - game AI should really be more along the lines of "mimic" a person.
I mean, if you want an AI that plays like a human, then model a ruleset which allows your opponent to play like you do, and not necessarily interpret information the way you do.
Make sense?
Yes, and I agree. Too many people think AI is supposed to recreate human (or animal or whatnot) psychology so it behaves properly. AI just needs to make it act that way, not actually be that way.
quote:
3 : Using multithreading to maximise AI is not a new idea, but it's one of the better ones. What I want to know is whether three threads (rendering, interface & AI) are enough or whether I should go less or more. Three work very well, but if I can do better, why shouldn't I?
I suggest minimizing thread count unless you have a really good reason not to. There is overhead involved per thread, so, start to finish, doing two jobs in one thread will take slightly less time than doing one job each in two threads; and if there are blocking dependencies, threading may cause all kinds of fun problems (slowdowns, deadlocks, etc.).
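One common alternative to a dedicated AI thread is time-slicing the AI inside the main loop, which avoids per-thread overhead and locking entirely. A minimal sketch of the idea - the scheduler, agent list and budget here are all hypothetical:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical time-sliced AI: each frame, update at most `budget` agents,
// resuming where we left off next frame. No threads, no locks - the AI
// shares the main loop with rendering and input.
struct AiScheduler {
    std::size_t next = 0;  // index of the next agent to update

    // Returns how many agents were updated this frame.
    std::size_t runSlice(std::vector<int>& agents, std::size_t budget)
    {
        std::size_t updated = 0;
        while (updated < budget && !agents.empty()) {
            if (next >= agents.size()) next = 0;  // wrap to start of list
            agents[next] += 1;                    // stand-in for real AI work
            ++next;
            ++updated;
        }
        return updated;
    }
};
```

Over several frames every agent gets serviced, and the per-frame AI cost stays bounded regardless of how many agents exist.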
quote:
4 : Just remembered this. I remember reading in one of the longer threads about reputation. I support this idea. For example, just have a value where 0 is pure evil, 255 is pure good and 127 is neutral (plus the grey area between). Then use it to determine who will talk to you, attack you, run away from you, where you can and cannot enter (temples and the like). One stat can replace quite a few. You could also use it to determine what kind of quests you get.
So how do you distinguish between (for example) Democrats and Republicans? Being a Democrat doesn't make you more evil than a Republican (it just seems that way ). Using one variable for reputation abstracts away a lot of information that could be used here. I suggest using one variable for each moral axis, meaning one variable for Charity/Greed, one for Passive/Aggressive, etc., and building a general reputation value from those.
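A minimal sketch of the per-axis idea - the axis names here are hypothetical, each axis is its own 0-255 value with 127 neutral, and a single "general reputation" is derived for code that only needs one number:

```cpp
#include <cassert>

// Hypothetical multi-axis reputation: one variable per moral pair
// (Charity/Greed, Passive/Aggressive, Honesty/Deceit, ...), each 0-255.
struct Reputation {
    unsigned char charity   = 127;  // 0 = greedy,     255 = charitable
    unsigned char passivity = 127;  // 0 = aggressive, 255 = passive
    unsigned char honesty   = 127;  // 0 = deceitful,  255 = honest

    // Collapse the axes into one 0-255 general value by averaging.
    // Code that only needs a coarse good/evil reading uses this; code
    // that cares about a specific axis reads that axis directly.
    unsigned char overall() const
    {
        return static_cast<unsigned char>((charity + passivity + honesty) / 3);
    }
};
```

This keeps the simple single-number interface for gates like temple entry, while letting (say) a merchant react to the Charity/Greed axis alone.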
Pax
MULTITHREADING? FOR AI? What, I have never heard of such a thing - there is no need for it! If you just mimic someone, then the AI will never be challenging enough for the player! If the character can learn, then let him learn! That will make the game much more fun in the future!
If I wanted to hear the pitter patter of little feet I would put shoes on my cat!
-----------------------------------------------------------"People who usualy use the word pedantic usualy are pedantic!"-me
Um ... the AI should be able to learn, I never said it shouldn't. I'm just saying that instead of trying to build a system that emulates the way we think, construct one that thinks differently but acts the same way we do.
-------------------------------------------------
"Children come to us in a state of purity and perfection from the great undifferentiated absolute and then, like everything else on this planet, we fuck them up."
"NPCs will be inherited from the basic Entity class. They will be fully independent, and carry out their own lives oblivious to the world around them ... that is, until you set them on fire ..." -- Merrick
The word for that school of thought in psychology is "Behaviorism", founded I believe by B. F. Skinner. It advances the idea that what goes on "inside" a person (meaning in an unobservable way - such as feelings of guilt, etc.) is totally irrelevant to understanding that human. If it affects behavior, it can be measured that way, and if it does not affect behavior, it is irrelevant, and does not necessarily even exist.
I do not really believe in behaviorist psychology, but I DO believe in behaviorist AI. Which means I agree with morfe: you make the game actions of an AI human-like, NOT the internal mechanisms.
WHEN we begin to really have the understanding and processing power to model an AI's internal workings on the human brain's internal workings (isn't a roach supposed to have something like millions of times more neurons than a computer has transistors? - can't remember the details), THEN we WILL see a benefit: the benefit of EMERGENT behavior ... behavior based on the set of systems you provided the brain and the game rules you placed it in, instead of purely programmatic behavior. But right now, probability tables and learning adjustments seem to be the best we can handle in a game that also renders 3D and runs all sorts of special effects and collision detection schemes.
First of all, an AI is not supposed to recreate the behaviour of someone or something. An AI is supposed to think (well, we would love to make it do that).
If you really want to put reasonable pressure on what AI should be, then look at what a computer is: an artificial brain.
Just like *any* other tool humans have created, the computer is yet another mimicry of something existing in Nature that we weren't satisfied with and decided to recreate, to compensate for some lack in ourselves. We didn't have strong enough fists to break things, so we created bone maces. We didn't have thick enough skin, so we created armour; we didn't have wings and couldn't run fast enough, so we created planes, cars, and all vehicles ... etc.
We couldn't count fast enough, so we created the Chinese abacus; then, hundreds of years later, we created the calculator, and now we have computers...
Now, have we really recreated a perfect model of a brain, or only some parts that have been enhanced? (I am thinking here that we only recreated some parts, essentially those concerned with calculus and logic.) And in that case, is the model we have of the human brain able to allow us to create another layer of abstraction, a piece of software, that would simulate a *full* model of a brain?
Think about it, and you'll see that in fact I don't really think we can do any realistic sort of AI before we change the way we design computers. And I am not talking about a CRAY becoming a family standard, but rather that the hardware itself should change ... after all, couldn't we somehow compare transistors to neurons? Maybe we didn't evolve the correct brain.
youpla :-P
-----------------------------Sancte Isidore ora pro nobis !
I believe it's true that to make efficient use of available resources while achieving AI on the human level, we would need a different computer design.
However, I believe that they''re actually working on this at Los Alamos, and are making quite a bit of progress.
The artificial brain might not be that far off...
quote:
ahw:
the hardware itself should change ... after all, couldn't we somehow compare transistors as neurons ?? Maybe we didn't evolve the correct brain
Well, it's a bit more complicated than that... transistors (and other solid-state electronics) are extremely limited compared to the wet-ware in our heads. Sure, the speed of an electron is hundreds of times faster through a strip of copper than it is across a synapse between two neurons -- but this does not mean that computers are (or can be) necessarily faster than a human brain. There are a few remarkable things about our brains that the current state of computer engineering cannot fully realize.
First, every neuron in our brain (and, according to some recent research, the glial cells too) acts as a separate processor running in parallel. This is not the case for individual transistors: each transistor is simply part of a single path of execution for the whole processor. Yes, the technology (and probably even the resources) exists to create a huge array of processors, each acting in parallel and mapping to a certain function of the "brain" that we wanted to recreate... but then comes the next problem.
The neurons in our brain have the ability to dynamically make connections to (one or more) other neurons. The current state of computer hardware does not allow for this, but it is necessary if we want to be able to escape a hard-wired rule set. Yes, it is possible to emulate this kind of dynamic connectivity through software, but then you've still got the hard-wired processor's rule set to consider. If anyone here read the thread on Emergent Behavior, you've seen my view on how to avoid this. Other ways include making our own type of wet-ware (protein transistors?) or quantum-based (molecular/atomic transistors?) processor components.
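The "dynamic connectivity in software" point can be sketched like this - nodes keep editable lists of input connections, so the wiring can be rewired at runtime, unlike a fixed transistor path. Everything here is a made-up toy, not a real neural model:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Toy network where connections are plain data, editable at runtime.
struct Node {
    std::vector<std::size_t> inputs;  // indices of upstream nodes
    int value = 0;
};

struct Network {
    std::vector<Node> nodes;

    void connect(std::size_t from, std::size_t to)
    {
        nodes[to].inputs.push_back(from);
    }

    // One update pass: each connected node's new value is the sum of its
    // inputs' current values; nodes with no inputs keep their value.
    void step()
    {
        std::vector<int> next(nodes.size(), 0);
        for (std::size_t i = 0; i < nodes.size(); ++i)
            for (std::size_t src : nodes[i].inputs)
                next[i] += nodes[src].value;
        for (std::size_t i = 0; i < nodes.size(); ++i)
            if (!nodes[i].inputs.empty()) nodes[i].value = next[i];
    }
};
```

Because `inputs` is just a vector, the topology itself can be mutated between steps - which is exactly what fixed hardware cannot do, and why this has to be emulated in software today.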
I was about to rant on the current and future state of AI, but then I smacked myself into remembering the topic of this thread. I agree with most people here that game AI should not try to think, but only mimic the actions that someone (something?) intelligent would do. Doing more is something of a waste of time (and processor cycles), since anything beyond what is necessary for perceived intelligence will never be recognized by the player of the game. The question that gets raised here is: what processes are necessary to reach perceived intelligence? And which processes are not required?
"Man is the only animal that laughs and weeps; for he is the only animal that is struck with the difference between what things are and what they ought to be."
--William Hazlitt
Greenspun's Tenth Rule of Programming: "Any sufficiently complicated C or Fortran program contains an ad-hoc, informally-specified bug-ridden slow implementation of half of Common Lisp."
quote: Original post by ahw
first of all, an AI is not supposed to recreate the behaviour of someone or something. An AI is supposed to think (well, we would love to make it do that).
If you really want to put reasonable pressure on what AI should be, then look at what a computer is : an artificial brain.
[snipped for brevity]
Think about it, and you'll see that in fact, I don't really think we can do any realistic sort of AI before we change the way we design computers. And I am not talking about CRAY becoming a family standard, but rather that the hardware itself should change ... after all, couldn't we somehow compare transistors as neurons ?? Maybe we didn't evolve the correct brain
youpla :-P
I think there is room for disagreement here.
What you suggest about "AI is supposed to think" might well be good for AI research in academia, but in the world of computer game AI programming, behavior rules - not thinking processes or mimicking humans.
The whole point of developing computer game AI is to provide the human player with interesting and entertaining opponents and NPCs that BEHAVE in believable ways. No human player really cares whether the observed behavior occurs because the AI "thinks" properly like a human. All that is important is that the AI looks (i.e. behaves) right.
Eric
Yep, sorry about that - wrong way to say what I was thinking. In the context of games, we mostly use tricks in order to save CPU, which is why I kinda implied it's not "real" AI. Academic is probably a much better word.
Actually, it's funny, because for my Masters I am gonna have to prove to the lecturers that Game AI is as serious as Academic AI ...
void* : sorry, I didn't think anyone here might have actual knowledge of biology, but then, you can understand even better why I say that we still have a long way to go
It's funny because I personally preferred studying neurons rather than those damn transistors ...
ah well, go figure !
youpla :-P
-----------------------------Sancte Isidore ora pro nobis !