Quote: Original post by Anonymous Poster
Quote: Original post by ilavos
See previous page
You're kind of missing the point here. In case you forgot, we were discussing my definition of intelligence. In it I said that it depends on past experience and current knowledge. In your example of a person born blind, the past experience and the current knowledge are different, which is why he cannot point out a colour, or a picture for that matter. But the thing is, this case is similar to Tarzan seeing a car: HE JUST NEVER SAW IT BEFORE. This is in no way linked to consciousness. You see, all you did in your post is talk about past experience, over which we were never in conflict. Thus your deaf person learning music is also a bad example in this case, since... the past experience is different.
I never said instinct was intelligence; I just said that consciousness and intelligence aren't related.
i think you've missed my point. i have tried to stress just one single point throughout all my posts - it's that experience is what makes consciousness, and without consciousness, there is no such thing as experience. no experience means no knowledge, and no knowledge means no intelligence. your tarzan example just showed that you didn't understand the subtle differences. but no matter, this isn't called the "hard" problem for nothing.
Bravo. Aside from taking what I was saying and twisting it, I really don't see what you did in this post. If indeed from the start you intended to say that past experience and consciousness were related this way, we wouldn't have gotten so far into such a futile debate. Make your points clear.
Quote: no experience means no knowledge, and no knowledge means no intelligence.
Honestly, read the previous posts and please be kind enough to remind me of my definition. You're good. Did I hear copyright? ;-)
Quote: this isn't called the "hard" problem for nothing
How very true. You should think about this.
What is intelligence?
Quote: Original post by Anonymous Poster
btw, your tarzan example is wrong because it's not just past experience that matters, it's _the ability to have experience_ that matters, i.e. consciousness. something which your definition must assume to begin with, but also something you have completely ignored.
Nice comeback. I like the way you twist things around. Are you a lawyer?
Quote: Original post by Anonymous Poster
and just to be sure you don't mistakenly read it as me condescending to you (that's not my intention): it's called the "hard" problem because historically, this problem has eluded many people even in the field of the philosophy of mind.
I agree. Don't worry, no offence taken. You say some pretty wise things at times. I really think you should listen to yourself more ;-)
Quote: people seem to just take the ability to experience, for example the red colour (as opposed to just responding to a 650 nm wavelength of light without even knowing what you responded to), for granted, and never even know that there is something more than just the wavelength when we say something is "red". but it is this very experience that forms our basic impressions, which allow us to form more complex ones like conceiving something as a car (here you see why your tarzan example was off the mark), that makes knowledge - which is the very basis of intelligence.
Ok. You know your grade 11 physics. That's good. But like I said, you're just repeating what I was saying. Obviously you know a lot and intend on letting the whole world know. But don't let your ego get in the way. Honest tip: one shouldn't always believe that what he thinks is what everyone should think. Your main objective was visibly to break me down, and look at the result: your arguments just mysteriously started sounding like mine ;-)
This is getting out of hand, so reply if you will; this is my last post to you.
Peace
PS: I really think you should read Timkin's post.
[Edited by - ilavos on August 6, 2004 2:33:28 PM]
August 06, 2004 09:13 PM
ilavos:
hm... i assume from your last post that you agree to make peace, so i won't be replying to those 3 posts you made, but will instead reply only to timkin's.
timkin:
i understand your point, at least i think i do. i used the word experience to mean just that - not past experience with "past" emphasized. i mean, chemical processes such as oxygen binding do not create past "experience" per se; if they happen as a matter of their nature, it's just natural laws, and every instance of it is just an instance, not really an experience. to use the word experience is already to assume that whatever has it is some form of an observer and thus must be conscious - it might no longer be conscious now; it might not have been conscious just before that, but given that experience is not transmutable, whatever has the past experience must have the experience (i guess maybe i should just switch to using "qualia") to qualify, regardless of time. this mirrors your second definition, but i don't see why we must pretend that we don't always use the word experience as a loaded word (an experience is a record or memory of a specific quale, after all), regardless of conjoining it with chronological terms, especially since i have made myself clear with the multiple examples. however, i must ask what you mean by agreeing conditionally?
and what would you say to my question on bugs? let me rephrase the question, however: is a program with a bug (obviously bad enough to produce unintended results) as intelligent as the same program that is bug free? my qualm with the whole ai thing is that people in this field always seem to like to propagate intelligence as a synonym for complexity. and by propagate, i mean they advertise it to make money with. the fact is intelligence has always been attributed to things that are conscious, not to just anything by its mere complexity. ai is considered "intelligent" because it acts like people, intelligent people to be sure. however, it is not the mere method/algorithm, which is what the ai is, that we consider intelligent, but the person who came up with the method/algorithm. i still remember a sort of joke from my stats prof (he's not the author obviously, but i don't know who is): given enough monkeys enough time with enough typewriters, by chance alone one of them would type up a shakespeare play. the point of course, as you all know, is not to show that you'll find an intelligent monkey if you wait long enough (then again, maybe it can be used for evolution... anyway). regardless of the original intent of the story, however, it should be obvious that shakespeare, if he had a typewriter, typing his play is not the same as a random monkey typing the same play by chance, even though the physical process involved, the typing action to be precise, is exactly the same. and an intelligent origin cannot be inferred from just anything that simply seems orderly or complex, i.e. the results we get from ai systems (hume was the one who made that argument, not on ai though, but on the problem of intelligent design, for those who are interested). and let's face it, a lot of those same people use the word intelligence only hoping others will think that the programs they are offering will act like real people rather than just a complex system.
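The long odds behind the monkeys-and-typewriters anecdote can be made concrete with a quick back-of-the-envelope calculation. This is a hypothetical sketch: the 27-key alphabet (26 letters plus space) and the sample phrase are my own assumptions, not anything from the post.

```python
# Back-of-the-envelope odds for the monkeys-and-typewriters story.
# Assumes a 27-key typewriter (26 letters + space) and uniformly
# random, independent keystrokes -- a deliberate simplification.

ALPHABET_SIZE = 27
PHRASE = "to be or not to be"

def chance_per_attempt(phrase: str, alphabet_size: int = ALPHABET_SIZE) -> float:
    """Probability that one run of len(phrase) random keystrokes
    reproduces the phrase exactly."""
    return (1.0 / alphabet_size) ** len(phrase)

p = chance_per_attempt(PHRASE)
expected_attempts = 1.0 / p  # mean number of tries before the first success

print(f"chance per attempt: {p:.3e}")
print(f"expected attempts:  {expected_attempts:.3e}")
```

Even an 18-character fragment already needs on the order of 10^25 attempts in expectation, which is the anecdote's point: the physical typing is identical, but chance and authorship are very different origins.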
this is why i feel it should be made clear, which i did in my first post, that neither possession of speed nor complexity alone is adequate for a system to be considered intelligent. rather, it is the ability to actually define problems (the thing that actually makes speed and complexity relevant - and i think we'd all agree that the whole notion of relevance is not a simple physical process) that makes the intelligence. and this, actually a very vague notion, is part of the "hard problem" of consciousness.
and i did put it that vaguely precisely to avoid our anthropocentric bias. that is, as long as that something, whatever it is, came up with the problem and hopefully is able to solve it itself, then even if that something is a thermometer (i did raise this point in one of my posts in this line), whose consciousness cannot be appreciated by us humans, it must still be considered intelligent - which i believe should alleviate the moral concerns you've raised. but of course, the question is how do we even know what has consciousness and what has not? and that is why i concluded that we can only know what intelligence is when we know what consciousness is.
August 06, 2004 10:47 PM
actually the whole contention can be summed up as this (we all know consciousness is an ass to write about, so let me try to cut it short):
i cannot find anything we deem intelligent that isn't:
1) either conscious themselves, or
2) some sort of physical implementation of solutions to a conscious being's defined problem, by conscious beings.
this is the missing component as i see it. if anyone can name just one thing that is intelligent and yet does not follow either of the above two principles, then i would have no problem agreeing (i can work out the rest of the reasoning). but as of now, i would be lying if i said i agree with the separation between consciousness and intelligence.
August 06, 2004 11:29 PM
hm, i should maybe register or something so i can add stuff...
just making sure it's clear: ai is included in 2), but the problem is that it is the legacy of conscious thoughts, and the intelligence can be traced back to those conscious thoughts, which means it does not reside in the physical processes that were exploited by conscious beings. so although i said "we deem it intelligent", what it is is really just an intelligent design (the physical process being exploited is never included in the compliment). so, anyway, just give me an example of a thing that has intelligence but doesn't involve those two principles, and you can be sure my comprehension of the separation will be ascertained. (i'm trying hard to make sure that doesn't sound like a challenge. if there really is stuff like that that i didn't realize, i feel it isn't too much to ask to be told what it is, for having spent the effort constructing my counter argument)
Quote: Original post by Anonymous Poster
if they happen as a matter of their nature, it's just natural laws, and every instances of it be just instances, not really as an experience.
There's no evidence that 'experience' is anything but natural processes. Let's certainly NOT start postulating souls to explain 'experience'.
Quote: Original post by Anonymous Poster
but given experience is not transmutable
There's also no evidence to show that this is true either. Certainly, we don't yet know HOW to do it, but that doesn't mean it cannot be done. Once we've completely decoded the functionality of the brain and its relationship to perception, THEN we could answer that question. Before that time, it is only conjecture.
Quote: Original post by Anonymous Poster
(experiences is a record or memory of specific quale, after all),
Memories certainly don't contain the qualia of the original event. For example, remembering a red object does not convey to you the same information about the object as seeing that red object in the first place. You have simply formed a compartmentalised internal model of the object (possibly based on the components of qualia at the time of observation) and are now reassembling the model given some recall trigger (this is a rough description of one of the current models of memory in the human brain). That is, you know roughly what 'red' looks like and you have a model of the shape of the object and you merge these together to form the memory when you recall it.
Quote: Original post by Anonymous Poster
however, i must ask you what do you mean by agreeing conditionally?
I don't agree completely with perception of qualia as necessary for intelligence because I don't believe that we yet fully understand what qualia is, yet I believe we have perfectly reasonable working definitions of what intelligence is.
Quote: Original post by Anonymous Poster
and what would you say to my question on bugs?
I'll need to think about that some more before I offer an opinion.
Quote: Original post by Anonymous Poster
the fact is intelligent has always been contributed to things that are conscious
That's definitely not true. Certainly, in the recent past the predominant view was that only intelligent beings were conscious (as I described above as a motivating factor for much of the modern era of industrialisation); however, that doesn't mean that only conscious beings are intelligent. Sadly the Catholic church has a lot to answer for on the matter of intelligence-consciousness, but I won't go into that here, because we're not having a religious debate.
To paraphrase your final point, I believe you are saying that intelligence is more than just intelligent programming or a process that produces an apparently intelligent result. I believe you're saying that the result is only intelligent to someone observing it, and only if they know the process was intelligent (and not random). I agree with this in part.
I don't agree that an observer is necessary for intelligence to exist, however I do believe that the only way we can identify intelligence in other agents is by observing their behaviours and asking 'what would we do in that situation?'
Certainly, I could encode my intelligence in a program and set it to work on a task. You would argue that there is no intelligence in the program. I would argue, though, that I have managed to imbue my computer with a limited proportion of MY intelligence, making that computer intelligent (at least in a small manner). I believe intelligence can be shared and/or created. If I didn't, I wouldn't waste my life researching artificial intelligence!
Furthermore, if I wrote a program that could observe its world, generate its own solutions to a problem and then test them for quality, then I would say that program is intelligent. An implemented genetic algorithm is one such program, even though its world may be a constrained mathematical construct.
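A minimal sketch of such a program, in the spirit of the genetic-algorithm example: it repeatedly generates candidate solutions, tests them for quality against its (admittedly constrained) world, and keeps the best. The target string, mutation rate and population size are illustrative choices of mine, not anything prescribed in the thread.

```python
import random

# Toy evolutionary search: the 'world' defines one problem -- match TARGET.
TARGET = "intelligence"
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def fitness(candidate: str) -> int:
    """Quality test: count positions that match the target."""
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(parent: str, rate: float = 0.1) -> str:
    """Generate a new candidate by randomly altering some characters."""
    return "".join(
        random.choice(ALPHABET) if random.random() < rate else ch
        for ch in parent
    )

def evolve(pop_size: int = 50, generations: int = 5000, seed: int = 0) -> str:
    """(1 + pop_size) loop: keep the best candidate found so far."""
    random.seed(seed)
    best = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
    for _ in range(generations):
        challenger = max((mutate(best) for _ in range(pop_size)), key=fitness)
        if fitness(challenger) >= fitness(best):
            best = challenger
        if best == TARGET:
            break
    return best

print(evolve())
```

With these settings the loop converges to the target well within the generation budget. The point is only that "generate, test, keep the best" is mechanically simple, which is exactly what the thread is asking: whether the intelligence resides in that mechanism or in whoever defined the problem and wrote the loop.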
Well, enough of my ramblings... before this thread continues, I urge everyone partaking to remember that this is a forum for discussing Game AI (AI related to game development). *Timkin puts his mod hat on* While it is often important to relate such issues back to larger issues with AI and intelligence, I want us all to remember what we're here for (and what the bandwidth is really for). I'm happy for this thread to remain open and continue, so long as it stays within the realms of a discussion of AI/intelligence (consciousness is a very tenuous branch).
*mod hat off*
Cheers all,
Timkin
Quote: Original post by Timkin
Memories certainly don't contain the qualia of the original event.
How can you be so sure given that you "don't believe that we yet fully understand what qualia is"? I tend to agree with you of course, but maybe for different reasons.
Quote: Original post by Timkin
For example, remembering a red object does not convey to you the same information about the object as seeing that red object in the first place.
Are you saying that remembering conveys LESS information than the original event (i.e. memories are less vivid) or that remembering conveys DIFFERENT information than the original event (i.e. memories are inaccurate), or both?
August 12, 2004 11:32 PM
Quote: Original post by gooddoggy
Quote: Original post by Timkin
Memories certainly don't contain the qualia of the original event.
How can you be so sure given that you "don't believe that we yet fully understand what qualia is"? I tend to agree with you of course, but maybe for different reasons.
It may be that memories contain some other qualia, to do with the 'experience' of having a memory... although I'm not convinced by even this. By definition though, qualia is tied into the event, so a memory of an event could encode a memory of the qualia, but not the qualia itself, since that belonged to the event. Make sense? Personally, I believe that qualia is just a byproduct of perception and cognition and not something that actually exists in the event. Perhaps, indeed, qualia ONLY exists in our memories and thoughts and not in the event itself!
Quote: Original post by Timkin
For example, remembering a red object does not convey to you the same information about the object as seeing that red object in the first place.
Quote: Original post by gooddoggy
Are you saying that remembering conveys LESS information than the original event
I'm certainly saying that memories convey LESS information ABOUT the original event (we all have a habit of adding in things that weren't there, which conveys irrelevant/erroneous information).
Quote: Original post by gooddoggy
(i.e. memories are less vivid)
I don't know that I want to go that far. People certainly have very vivid memories of events that they only experienced momentarily. Car accidents are a good example. The question arises, though: was the original event vivid at the time, or did the brain store a lot more information than our focus noted at the time?
Quote: Original post by gooddoggy
or that remembering conveys DIFFERENT information than the original event (i.e. memories are inaccurate)
There is certainly enough research data to suggest that memories don't match the actual events that were experienced, and that models of memory that suggest reconstruction of the event from component memories are quite likely. Given such a model, it is easy to see how a memory could be corrupted by incorrectly using the wrong component to recreate the model. For example, you see a green tree and a red car and note that the car was a different colour to the tree. Later, though, your brain tries to pull out the colours of the objects but cannot recall the car's colour. Knowing that it was different from the tree, your brain substitutes an answer (which answer, I don't know... maybe the most common car colour, or the expected colour, or maybe a colourless car with horns because you were dreaming! ;) ).
Hope this helps clarify my stance.
Cheers,
Timkin