
Artificial Intelligence - Environment Interaction

Started by Lenox, October 15, 2005 09:28 PM
Last reply by xEricx
Hello, Lenox here. I was idling in IRC today and saw someone mention an idea they had been thinking about: creating AI that starts out without any knowledge of its environment, then learns about its surroundings as it goes and becomes more capable of interacting with them. This made me ask myself (and it's also what I'm asking you now), "How could I go about doing this?" So, to sum it up:

* I'd like to make an AI that can learn about and interact with its environment. Does anyone have any good resources on this?
Hmmm, first of all, are we talking about the real world (robotics, e.g. Lego Mindstorms) or a virtual world that you would build yourself?

For the real world, I have no clue, but I guess the subject is covered in depth by tons of papers from the robotics world.

For a virtual world, well, you are the one defining what's doable with the environment. So I guess you could build a "knowledge database" for your AI, where it can record the results of its trial and error: something that lets it store a [current object state] -> [action] -> [new object state] entry for each attempt.
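
Just to illustrate, here is a minimal C++ sketch of such a knowledge database. The Transition and KnowledgeBase names, and the use of plain strings for states and actions, are just assumptions for the example; in a real game you would plug in your own object and state representation.

    #include <map>
    #include <string>
    #include <vector>

    // One observed result of trying an action on an object in a given state.
    struct Transition
    {
        std::string action;    // e.g. "push", "burn", "pick_up"
        std::string newState;  // state the object ended up in
    };

    // Very small "knowledge database": maps an object state to every action
    // the AI has tried from that state and what that action produced.
    class KnowledgeBase
    {
    public:
        // Record the outcome of one trial-and-error attempt.
        void Record(const std::string& currentState,
                    const std::string& action,
                    const std::string& newState)
        {
            Transition t;
            t.action = action;
            t.newState = newState;
            m_transitions[currentState].push_back(t);
        }

        // Everything the AI has learned about a given state so far
        // (empty if it has never been in that state).
        const std::vector<Transition>& Known(const std::string& state) const
        {
            static const std::vector<Transition> s_empty;
            std::map<std::string, std::vector<Transition> >::const_iterator it =
                m_transitions.find(state);
            return it != m_transitions.end() ? it->second : s_empty;
        }

    private:
        std::map<std::string, std::vector<Transition> > m_transitions;
    };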

Now, given some goals, your AI could mess with the environment until it has a valid chain of actions that accomplishes a goal.
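
Continuing the same sketch (reusing the Transition and KnowledgeBase types from above, still just one assumption about how you might structure it), finding that chain of actions could be a simple breadth-first search over the transitions the AI has recorded so far. An empty result means it hasn't learned enough yet and should go experiment some more.

    #include <cstddef>
    #include <deque>
    #include <set>

    // A partial plan: a state reached during the search plus the
    // actions that lead there from the start state.
    struct PlanNode
    {
        std::string state;
        std::vector<std::string> actions;
    };

    // Search the learned transitions for a chain of actions that turns
    // 'start' into 'goal'. Returns an empty vector if no chain is known yet.
    std::vector<std::string> PlanActions(const KnowledgeBase& kb,
                                         const std::string& start,
                                         const std::string& goal)
    {
        std::deque<PlanNode> open;
        std::set<std::string> visited;

        PlanNode first;
        first.state = start;
        open.push_back(first);
        visited.insert(start);

        while (!open.empty())
        {
            PlanNode node = open.front();
            open.pop_front();

            if (node.state == goal)
                return node.actions;  // a valid chain of known actions

            const std::vector<Transition>& known = kb.Known(node.state);
            for (std::size_t i = 0; i < known.size(); ++i)
            {
                if (visited.count(known[i].newState))
                    continue;
                visited.insert(known[i].newState);

                PlanNode next;
                next.state = known[i].newState;
                next.actions = node.actions;
                next.actions.push_back(known[i].action);
                open.push_back(next);
            }
        }

        return std::vector<std::string>();  // nothing learned yet reaches the goal
    }

You could later swap the breadth-first search for A* or weight transitions by observed cost, but for a first experiment this is enough to see the "learn, then plan" loop working.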

Also, were you thinking about implementing this for a game, or just as an AI simulation?

I hope this contained some useful information. Have fun implementing it :)

Cheers

Eric

