
Deeply Theoretical Questions.

Started by January 24, 2002 11:42 AM
9 comments, last by Tricron2 22 years, 9 months ago
When thinking about how human intelligence works, I came up with a few questions that might bear some relevance. (If you don't want to read much, just skip to the bottom, where I've summed up the questions.)

-Sensory Input-
Well, for starters, if a being has no sensory input, can it be intelligent? Do blind people see colors in dreams? Do they even see images in dreams? Do deaf people hear audio? So if you had a being that couldn't see, hear, feel, smell, or taste, would it be intelligent, given that it has no input to process? Would intelligence develop? I originally thought no, only instinctive behaviors would take place. But with no input, how would instinctive behaviors know when to take place?

-Applications for Instinct-
Which leads nicely into my next topic: the importance of instincts. When a baby is born, it immediately starts crying, which means it starts breathing, kicking, and moving. Without that, would a child ever find the muscles necessary to move its limbs and breathe? I think the problem with a lot of AI programs is that they hard-code the operations the AI can do. Instead, an operation should take place due to a condition; then the AI should analyze the function that performed the operation. The AI should also be able to find ways to trigger those functions on demand (through variable manipulation and function calls).

-Importance of Emotion-
Lastly, what about emotion? Most humans think of AI as a cold, artificial, calculating menace. However, I think AI must have some form of emotion. All emotion does is tell us what is good and what is bad. If, no matter what you did, you would still feel the same, what's your drive to do anything at all? If killing twenty people, eating pizza, and having sex, along with everything else, did not affect me in any way, why would I perform any one action rather than another? My actions would be quite random. With emotion to dictate what's good and what's bad, AI could then be made to perform many actions better, e.g., compose music.
To sum up, the questions were:
- Do you think an entity with no sensory input could be intelligent?
- Do you think instincts are important to the development of intelligent beings?
- Do you think emotion is required to make something seem truly intelligent?
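The instinct idea in the original post (operations fire reflexively on conditions, and the agent then learns to invoke the same operations deliberately) could be sketched roughly like this; all names here are hypothetical, purely for illustration:

```python
class Agent:
    """Sketch: reflexes fire on conditions; the agent records which
    operation fired so it can later trigger it on demand."""

    def __init__(self):
        # "Instinctive" wiring: condition -> operation, hardcoded up front.
        self.reflexes = {"hungry": self.cry, "hurt": self.cry}
        self.learned = {}   # goal -> operation, discovered through experience
        self.log = []

    def cry(self):
        self.log.append("cry")

    def sense(self, condition):
        # Reflex path: the operation fires because the condition demands it.
        op = self.reflexes.get(condition)
        if op:
            op()
            # Learning path: remember which operation this condition
            # triggered, so it can later be called deliberately.
            self.learned[condition] = op

    def act(self, goal):
        # Deliberate path: trigger a known operation on demand.
        op = self.learned.get(goal)
        if op:
            op()

agent = Agent()
agent.sense("hungry")   # reflex fires
agent.act("hungry")     # same operation, now invoked deliberately
```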
Well, that was a fairly long post, so it's a little tough to decide where to reply first. I guess I'll start from the beginning. To ask whether intelligence can be present in the absence of sensory input, one must first define intelligence. What do you call intelligent behavior? Do you want to say that an "agent" without any sensory input should be able to function like a human being with sensory input? Or do we just want to compare it to some intelligent being who has also been deprived of all sensory input?

The textbook definition of an "intelligent agent" is one that performs according to the input it is given. So, theoretically, the measure of intelligence is based on how you design the agent and what you want it to accomplish. An agent without any sensory input is, by that definition, still intelligent, because it's performing the best it can with the limited (in this case, nonexistent) input. Even if it just runs aimlessly into walls and falls over, it should still be considered intelligent. Think of it this way: if we were to deprive a human being of all their senses, what could they do? Probably nothing. So intelligent agents can exist even without sensory input, by definition. But to answer the question you were really asking: if a self-proclaimed intelligent being such as a human cannot even function under the condition of zero sensory input, then I would conclude it is impossible for a human to design an agent that would exhibit truly intelligent behavior under those conditions.

As for the question about instincts: as a student in the field of AI, I too believe that instinct should play a major role in developing true intelligence in machines. My conclusion is that every action people take breaks down to survival instincts, or to a complex compound of the need to survive and other goals. As babies, we start out with pure and simple survival instincts. We cry when we're hungry. We cry when we get hurt. The only rule is "survival." Interestingly, as we age and acquire intelligence through experience, the original "primal" instincts seem to become covered up by reason and "logic." But what do we base that logic on? Why do we do things that make us "happy"? My answer is that we do things that make us "happy" because, instinctively, the emotion of happiness is directly tied to survival rate. It is almost like a built-in performance function: the better the resulting value of the function, the better our chances of survival. So, to answer the last two questions in one blow: in my opinion, emotions and instincts are tied together, and they are essentially the same thing. If we are capable of instilling survival instincts into a machine, eventually it would develop some sort of primitive emotions. I know it's kind of vague, but hopefully I've conveyed it in a reasonable sense.
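The "built-in performance function" idea above can be made concrete: treat an emotion like happiness as a score over states, and have the agent pick the action whose predicted outcome scores best. Everything below (the weights, the state fields, the actions) is a made-up illustration, not a real model:

```python
def happiness(state):
    """Hypothetical 'emotion' as a performance function:
    higher values stand in for better survival odds."""
    return 2.0 * state["food"] + 1.0 * state["safety"] - 3.0 * state["injury"]

def choose_action(state, actions):
    """Pick the action whose predicted next state 'feels' best."""
    return max(actions, key=lambda act: happiness(act(state)))

# Two candidate actions (illustrative only); each predicts a next state.
def eat(state):
    return {**state, "food": state["food"] + 1}

def hide(state):
    return {**state, "safety": state["safety"] + 1}

state = {"food": 0, "safety": 0, "injury": 0}
best = choose_action(state, [eat, hide])   # eating scores 2.0, hiding 1.0
```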
quote: Do you think an entity with no sensory input could be intelligent?


"Cogito ergo sum"

Descartes held that without sensory input one could still deduce existence from the notion that doubting one's existence constituted thought, and that thinking required an agent and therefore an existence.
"I thought what I'd do was, I'd pretend I was one of those deaf-mutes." - the Laughing Man
quote: Descartes held that without sensory input one could still deduce existence from the notion that doubting one's existence constituted thought and that thinking required an agent and therefore an existence.


First, existence has nothing to do with intelligence. You could argue the reverse, but that's totally off topic. (Though you could argue that one wouldn't know what existence was if one had no input.) Anyway...

In my thinking about developing intelligence (for small games), I believe intelligence is developed through experiences (input). As for instinct, I don't know how it is embedded in the mind, but in developing intelligence for computers, the base cases we give the AI act as instinct, and it should learn from its experiences after that.
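A minimal sketch of that base-cases-as-instinct idea (the situations, actions, and learning rule are all invented for illustration):

```python
class LearningAgent:
    def __init__(self, instincts):
        # Base cases ("instincts") seed the policy before any experience.
        self.policy = dict(instincts)

    def act(self, situation, default="explore"):
        return self.policy.get(situation, default)

    def learn(self, situation, action, reward):
        # Crude rule: keep actions that worked, drop ones that didn't.
        if reward > 0:
            self.policy[situation] = action
        elif self.policy.get(situation) == action:
            del self.policy[situation]

agent = LearningAgent({"enemy_near": "flee"})   # hardcoded base case
agent.act("enemy_near")                         # "flee": pure instinct
agent.learn("enemy_near", "attack", reward=1)   # experience overrides it
agent.act("enemy_near")                         # "attack": learned
```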

Will

My Gamedev Journal: 2D Game Making, the Easy Way

---(Old Blog, still has good info): 2dGameMaking
-----
"No one ever posts on that message board; it's too crowded." - Yogi Berra (sorta)


Well, I slightly differ from the replies given. Here's my vision:

1. Without any sensory input one cannot learn. We can only learn if we know whether an action is right/wrong, good/bad. If you don't have any sensors you never get any feedback on your actions, and therefore cannot learn.

2. What we do have without any sensors is instinct. But what you guys call instinct, I call hardcoded behaviour. Lots of behaviour is hardcoded (shaped by evolution), like crying, breathing, and reproducing. I therefore think that in some cases you can hardcode part of the AI in your app, when it is consistent with reality. Insects, for example, run almost entirely on 'instinct' and learn very little, even though they have sensory input.

3. Emotion = behaviour, and is therefore defined by three things (according to Taine): race, milieu et moment. This means there is a hardcoded component, a component dependent on how you were raised (environment), and a component dependent on the situation (unpredictable).

Edo

Just to clarify - I don't particularly agree with Descartes. I just threw it out there because the slogan is well known, makes a good delimiter to the question, and it comes from a person without whom we could not be having this discussion.

People with limited sensory inputs are capable of intelligence; Helen Keller, for example. But I don't think a baby that couldn't at least feel would last long. How would the child know to eat if its belly didn't ache?

I agree with WierdoFu here.

Although I'm not comfortable with equating emotion and behavior, I think the direction that Edo is heading might bear fruit - but I think a better set of names for the three factors would be instinct, culture and situation. "How you were raised" to me speaks more of culture than it does environment.

Giving the question a little more thought, I think that linguistic capacity is more important to intelligence than emotion. And further, emotion might contradict intelligence altogether - I know it interferes with human intelligence far too often. I think emotion would be necessary to make something seem as if it had at least an animal intelligence - more specifically, a mammalian intelligence.

- Mike


‘But truth's a menace, science a public danger.’ Brave New World, Aldous Huxley
"I thought what I'd do was, I'd pretend I was one of those deaf-mutes." - the Laughing Man
No, Yes, No

Pretty simple, huh?
No - Intelligence would never have evolved in humans without perception and senses; intelligence would not develop in any human without the ability to use senses as feedback.

No - I'm separating instinct from reactive behaviour, which are two different things. I think it is possible to develop appropriate reactive behaviour by imitation (or other learning) without any instincts. For mammals, however, instinct has been very important over the millennia.

No - I think emotions hamper intelligence more than anything, to be perfectly honest. But they do give you an overall goal in life.


Artificial Intelligence Depot - Maybe it's not all about graphics...


The topic of how intelligence is manifested within a being is a very interesting one indeed. Once I was surfing the net for information on the SETI project (I was looking up that one "probability of intelligent life" formula... don't ask why) and I stumbled across a page with an in-depth thesis on the matter. It went over everything you could think of: what is intelligence, how did it evolve, what is required for it to evolve, what other non-human physical constitution could support intelligence? Unfortunately, I lost the URL =-( But I remember many of the important topics.

Some of the basic requirements for intelligence evolving are:

Sensory Organs- eyes, ears, nose, etc
All beings require some way to take in data if they are to react to, and even survive in, their environment.

Memory Organs- the brain
What good are input organs if there's nowhere to store what they take in? A being must be able to retain what it knows. This also allows it to pass its knowledge on to others and its offspring, which expedites the process of becoming intelligent.

Manipulatory Organs- Hands, feet
Limbs that allow the being to move itself and manipulate its environment are also necessary. Otherwise, it is a passive organism like a plant: it cannot force events to occur, but must wait for them to happen, and so such a being will take much longer to evolve intelligence, if it does at all.

Transmission Organs- mouth, ears
All intelligent beings require some way to send and receive data from their kin. Doing so also expedites the evolution of intelligence. The better their language is for communicating thought, the more potential they have to learn.

Defensive Organs-
The universe itself is one of the most hostile things to life. Be it astronomical dangers like meteors or competition with other life forms, all beings must have some defense against nature if they are to live long enough to evolve.

I may be missing some things, but oh well... it was a while ago I read this.

You will notice, however, that the article did not mention emotion. An emotion is more like a reaction to data the being collects. So when programming your AI, it's up to you what emotions (or reactions) it will have in certain situations. You could make an AI that doesn't like the sight of blood and will always try to evade you instead of confronting you, but that may make the game a bit irritating, fighting "wussy AI".

I also do not recall much on the subject of instinct. All beings need instincts; they are the basic instructions and impulses that allow a being to survive. Instincts may dictate a being's reactions to a certain degree, but those reactions can often be overridden (conquering your fears, for example). This is why we often like to add a certain luck factor into an AI's decision making. It simply makes the AI a bit more realistic.
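That "luck factor" is easy to sketch: add a bounded random term to each action's utility before picking the maximum. The function name and the utility numbers below are made up for illustration:

```python
import random

def choose_with_luck(utilities, luck=0.5, rng=random):
    """Pick the highest-utility action after adding a random 'luck' term.
    luck=0 makes the agent perfectly rational; larger values make it
    less predictable (able to 'override' its default reaction)."""
    noisy = {action: u + rng.uniform(-luck, luck)
             for action, u in utilities.items()}
    return max(noisy, key=noisy.get)

utilities = {"attack": 0.9, "flee": 0.7, "hide": 0.4}
action = choose_with_luck(utilities, luck=0.5)   # usually "attack", not always
```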


After reading all that, though, I clicked a link that led me to a discussion of what a "self" is. At that point, things got really freaky (think AI-will-take-over-the-world scary). I really wish I could find the site, but after thorough checking I could not relocate it. (The equation I was looking for is called the Drake equation; it's a bunch of unknown variables that estimate the probability of extraterrestrials, but it has no actual application beyond looking cool.)
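For reference, the Drake equation is just a product of factors: the rate of star formation, the fraction of stars with planets, habitable planets per system, and the fractions that develop life, intelligence, and detectable communication, times the lifetime of a communicating civilization. A quick sketch; the parameter values plugged in below are purely illustrative guesses, not authoritative estimates:

```python
def drake(R_star, f_p, n_e, f_l, f_i, f_c, L):
    """Drake equation: expected number of communicating civilizations
    in the galaxy, N = R* * fp * ne * fl * fi * fc * L."""
    return R_star * f_p * n_e * f_l * f_i * f_c * L

# Illustrative guesses only: 1 star/year, half with planets,
# 2 habitable planets each, life always arises, intelligence and
# communication each half the time, civilizations last 10,000 years.
N = drake(R_star=1, f_p=0.5, n_e=2, f_l=1, f_i=0.5, f_c=0.5, L=10_000)
```

Since every factor except R* and L is a fraction or small count nobody actually knows, the output swings over many orders of magnitude, which is exactly why the equation "has no actual application beyond looking cool."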

"I'm not evil... I just use the power of good in evil ways."
Consciousness / Intelligence

One view: a computer can never be truly intelligent, because of consciousness (roughly the Weak AI position).

The other view: computers will be truly intelligent one day, and will be able to do everything a human can (roughly the Turing Test approach / Strong AI position).

But how are consciousness and intelligence interlinked? We never compare the intelligence of a computer with that of an insect. Why not? I would imagine that if we ran a Turing-style test between a computer and a worm, we would probably not have a clue which was which.

I think that making a computer intelligent really depends on things like the Turing Test: fooling a human into believing the computer is also human. But even if it succeeded, would it really be intelligent if it still wasn't conscious?


Hmmmm, I wonder?

Bye

