Are you a Cosmist or a Terran?
I would like to find out if you are a Cosmist or a Terran, based on this article I found:
http://www.cs.usu.edu/~degaris/artilectwar2.html
=================================
I think the article raises some very interesting questions - not about whether we CAN create AI...but rather whether we SHOULD create AI...and what the consequences would be of setting such creatures loose.
I am not sure how ready I am to hand over some things to true AI beings. Would you want them operating on you? Would they "care" about your fate like a real doctor (hopefully) would?
How would AI beings see us?
As friends?
As enemies?
As bugs to be squashed (in other words...indifferent to our fate or if we die or not)?
Can we truly call something artificially intelligent if we don't trust it enough to set it free and let it explore and do what it wants to do? Can we say...be smart...be logical....but only as long as you do what I want?
Presumably, if we can create intelligence, then we can teach it social and personal ethics and morality and permit it to explore those. Sure, this could lead to some nasty outcomes, but they wouldn't be much worse than the sorts of severe perturbations from the 'common good' that we see around the planet every day: genocide in the Balkans or Africa, inhumane treatment of our fellow humans, war, politics, etc., etc.
Is it likely that robots will take over the world and enslave or annihilate humans, a la The Matrix and Terminator? I doubt it.
Timkin
The essayist himself concedes that he is not 100% Cosmist and that he grapples with the ethical implications of his work, so the question may not be an either/or proposition. I dislike that the author felt he had to make up a new word, "gigadeath", to describe human extinction. This suggests to me a profound lack of human decency in his work - despite his claims to the contrary. A technical language that obscures the reality of what it pretends to discuss lends itself well to the dissolution of ordinary decency and thus, in this case, to the horrors it supposedly warns against.
Anyway, to answer the question, I'm a terran and have been since I first read one of Moravec's books 15 years ago. Just the same, de Garis's essay is interesting and worth the read.
Here's the url, linkified:
"The Artilect War", Second Version, 2001
"I thought what I'd do was, I'd pretend I was one of those deaf-mutes." - the Laughing Man
Of course we want to create AIs. Who else would run the General Systems Vehicles while we humans frolic without worries?
enum Bool { True, False, FileNotFound };
Cosmist, I.
I can't help but see the creation of machine intelligence as an augment for human intelligence as the next evolutionary step for humans. If the technology becomes available, I would be quite happy to become a cyborg, and/or transition completely to machine when my human body is too old and decrepit to be fun.
On further reflection, I can imagine that Terrans would be far more in danger from cyborgs than from pure machine intelligences. Machine intelligence, according to my intuition, will most likely be extremely abstract and not significantly concerned with humans except where they affect its current task (i.e. an intelligent traffic routing system would need to know statistical information about the travel plans of humans with respect to time, but it wouldn't really care what the humans had for lunch). In comparison, cyborg intelligences will be human, with millions of years of kill-or-be-killed evolution shaping their thoughts, but with the addition of a tremendous amount of intelligence. As such, while the aforementioned traffic routing computer would not be seen by Terrans as a threat, the cyborg would be seen as such, and furthermore would be equipped to respond in kind if Terrans did attempt to destroy it.
Quote: Let me try to express this Terran revulsion against the cyborgs in an even more graphic way that may have a stronger appeal to women than to men. Take the case of a young mother who has just given birth. She decides to convert her baby into a cyborg, by adding the "grain of sugar" to her baby's brain, thus transforming her baby into a human faced artilect. Her "baby" will now spend only about a trillionth of its mental capacity thinking human thoughts, and the rest of its brain capacity (i.e. 99.9999999999% of it) will be used for thinking artilect thoughts (whatever they are). In effect, the mother has "killed" her baby because it is no longer human. It is an "artilect in human disguise" and totally alien to her.

I think here the author is misunderstanding the nature of mental augments, as I envision them. The purpose of such an augment is not to be a 'second brain' taking over the host, but to add processing power to the host's own mental system. Hence, while 99.9999999999% of the baby's brain power will indeed come from the implant, 100% of its (augmented) brain capacity will be thinking 'human thoughts', but astronomically more thoroughly and at a greater rate than any merely human baby.
Cosmist.
And I think he doesn't dismiss the IA people (intelligence amplification, or the "cyborg" faction he counts under the cosmist faction), but just thinks that if most of your sense data comes from sources other than your body, you really aren't that human anymore. Your "self" is elsewhere. (if you even have one, single self)
I really would like to know if anyone here is Terran.
Quote: Original post by C-Junkie
I really would like to know if anyone here is Terran.
I said I was. I'm still reading the entire essay. I wouldn't advocate killing off cosmists, though. I find the idea that Terrans would go so far as to attempt to wipe out Cosmists rather far-fetched. Nuclear weapons could lead to human extinction as well, yet there hasn't been a movement to exterminate nuclear physicists.
"I thought what I'd do was, I'd pretend I was one of those deaf-mutes." - the Laughing Man
Quick test:
URL: "The Artilect War", Second Version, 2001
=============
Moravec's book was titled???
Quote: Original post by LessBread
I dislike that the author felt he had to make up a new word, "gigadeath", to describe human extinction. This suggests to me a profound lack of human decency in his work - despite his claims to the contrary.

It's a simple technique. Same with the word "singularity." All it is is a nifty way of observing the influence of your work. If you see people talking about "gigadeath", you know you're the one that started this discussion, rather than some other source.
Quote: Anyway, to answer the question, I'm a terran and have been since I first read one of Moravec's books 15 years ago. Just the same, de Garis essay is interesting and worth the read.

I'm going to have to figure out who Moravec is.