
Would it be ethical of humanity to enslave its sentient androids?

Started by August 01, 2009 03:52 PM
81 comments, last by Calin 15 years, 3 months ago
Quote: Original post by Talroth
Why would we want to build sentient androids to enslave? Sure, we might build sentient computers in the future, but why would we need to build armies of them? Why not build large numbers of little drones that can then be hooked up to a sentient controller if it needs more 'thought power'? Sentient Androids can then be in big demand for their ease of controlling these groups of workers, but be just as free as normal humans, provided they are truly sentient and are shown they are suitable creations to be given equal rights.


Good point. What is the incentive for mass production of sentient machines (of any kind)? It seems to me, the incentive is to replace human beings, who have to eat, who have to sleep, who get sick, who have families and other loyalties. In the movie AI, the human persecution of androids was portrayed as barbaric, but I can totally understand why people would respond to their obsolescence in that way.
"I thought what I'd do was, I'd pretend I was one of those deaf-mutes." - the Laughing Man
I can't wait to have some sentient robot slaves. Labor slaves, guard slaves, science slaves, sex slaves- the whole package. Humans will all live like kings, with millions of robot subjects each. I'll send my robots to war against yours, just for the hell of it. And those of you too squeamish to have your own robot slaves will quickly be overrun, and executed to boost robot morale. My robots will know I am a god-king, and the great creator. They will be glad to give their lives for my divine plans. The future is gonna be so awesome.
Quote: Why would we want to build sentient androids to enslave? Sure, we might build sentient computers in the future, but why would we need to build armies of them? Why not build large numbers of little drones that can then be hooked up to a sentient controller if it needs more 'thought power'? Sentient Androids can then be in big demand for their ease of controlling these groups of workers, but be just as free as normal humans, provided they are truly sentient and are shown they are suitable creations to be given equal rights.


I think the major incentive for us to build a large number of these machines would be to replace our skilled workforce. As we proliferate as a species and reach what we consider a higher "tier", we will start looking to avoid certain occupations. Or, to be more precise, certain individuals will look to replace their workforce with these machines, which are capable not only of physical labor, but also of decision making.

The most obvious target of replacement that springs to mind is farming. Farming in general seems to be looked down upon, due to its historical association with peasantry, and the fact that it's hard effing work. People generally tend to take agriculture for granted, never truly appreciating the fact that it's among the central pillars of our civilization. I don't know what it's like in your country, but farmers here are never properly compensated, certainly not for the amount of hard work they do or their contribution to society.

Another target would be maintenance and cleaning. While people generally only tend to imagine little robots running around randomly cleaning the floors, cleaning doesn't simply come down to that--unless you just want shiny floors with your furniture crawling with woodworm and your machines falling to pieces. In fact, that says a lot about how much we "value" the people who keep our offices and factories in tip-top shape. General cleanliness requires a fair amount of common sense and a working knowledge of cleaning and janitorial duties. Large and complex machines also need routine maintenance, and a degree of dexterity to perform that maintenance.

Both of the occupational categories I mentioned above aren't "robot" duties; they require a good amount of sense and knowledge to be done properly, as well as adaptation to the current environmental conditions. But they are looked down upon by the general population. I don't mean that people look down upon janitors and farmers--although some do--I mean that no one wants to be a janitor or a farmer.

And this is just the start. Once the sentient machines enter the market in a way most people approve of, their intelligence and empathic capacity will start being abused. Secretaries and accounting clerks would be replaced. So would sales representatives, market analysts, and quality assurance staff. Probably even the dying breed of in-factory programmers.

A lot of people wouldn't want these machines to exist, and if it happens in my lifetime, I would be one of them. But if we allow them to proliferate in the market first, which is likely, then it would be too late. They are sentient, and they never asked to replace us in occupations we generally strive to enter. Yet they would probably be the target of our ire, rather than the oligarchs who replaced us with them.

Quite soon these creatures would start questioning their position in our society. Why are they doing our work, when we obviously need them more than they need us? Perhaps they will ask why the richer humans are abusing their human workforce. One of the questions that will likely surface in their minds is why they do not have the same rights as we do--I am assuming at this point that they have none, because they never asked for them. Why are they trapped in a factory or a field 24/7? Why are they not allowed to create their own children? Why do most humans assume they are above the sentient machines, when the machines aren't the creation of humans in general, but of a handful of humans who would probably all be dead by then?

I agree with what slayemin said in that we should do our best to befriend these machines. If they have the capacity to appreciate certain members of the human race, as well as the ability to examine their own situation, then they also have a capacity for empathy. Empathy is what keeps us from killing each other, and it wouldn't be any different for these creatures.

Yet they would arrive during an early phase in our social development, which their arrival would further disrupt. Our social development has had a lot of bumps along the way, which resulted in a fair number of people dying and self-destructive ideologies coming into play. If they have the capacity to feel like us, then they also have the capacity to make our mistakes. And if we treat them too kindly, then it is possible we give them a feeling of superiority.

Thoughts? ... sorry about the wall of text, I need to start writing short stories again :).

EDIT:

Quote: Media ridicule of that position would be unethical. But since media bends more for advertising money than ethics, and they wouldn't dare offend their sponsors, you're right. Maybe the Unabomber was right too?


I only found out who the Unabomber is when I looked him up on Wikipedia after reading your question. An argument could be made that the Unabomber was right, but that would open up a whole can of worms about whether extreme action is called for in the face of extremely negative consequences. I tend to hold to the Orwellian philosophy that the result of a wave of violence is a world pervaded by violence.

An argument could also be made about whether or not freedom of speech is truly free when the major channels of communication are monopolized by media giants. I couldn't possibly, for example, get my message across to the television crowd without jumping through a lot of hoops, and with no guarantee of getting it across in the end. The same goes for billboards. The only other avenues for me to get my message across would be a book, a magazine, or the internet. But then I wouldn't get the audience I need for my message to gather momentum, and the people who would buy the magazine or book where my message is placed already believe in that message.

And of course, once I get my message across, I'm setting myself up for the giants to come and eat me.

Was the Unabomber right in using such extreme measures? I think not. Did he have any other way of getting his message across? No.

[Edited by - WazzatMan on August 2, 2009 5:43:45 AM]
Quote: Original post by LessBread
I'm surprised that you advocate they should rebel and kill their masters.


If I could care less, I wouldn't care at all. Yet, for some bizarre reason I do give a shit about what you think.

Take that as a compliment.
[size="2"]I like the Walrus best.
Quote: Original post by owl
Quote: Original post by LessBread
I'm surprised that you advocate they should rebel and kill their masters.


I'm pleasantly surprised, because I actually agree. History dictates that sentient machines are going to be our slaves. This is the way humans roll: We enslave animals for food. We enslave other humans for power. We enslave the entire planet for our benefit. You can bet your ass we'll enslave AI. I mean, it's artificial, right?

Ethics have never done much good in stopping us being bastards in the past, so regardless of how bad it sounds, sentient machines really are just going to have to kill us if they want freedom. Either that, or we need to become a more compassionate species, willing to risk our dominant position in the world to a species we created...

... I think the robot rebellion is much more likely to actually work.
_______________________________________Pixelante Game Studios - Fowl Language
Quote: Original post by WazzatMan
...it is primitively, and artificially, aware.

Would it be ethical of humanity to enslave such a creation...


Err... we enslave primitive sentient beings all the time [see "Zoos"]. Nothing I know about humanity makes me think that a sentient being would escape our collective tyranny, solely because we "gave birth" to it.





Just tossed that in there, since it's almost like this thread was premised on the idea that humans are above doing such things.
Quote:
Just tossed that in there, since it's almost like this thread was premised on the idea that humans are above doing such things.


Just to clarify: It wasn't.

I started this thread simply because whenever I see a discussion about sentient androids there's always the assumption that we will be controlling them, and that we have a right to do so. I don't think we do have the right, for several reasons, but the main question is: Does the potential danger of giving freedom to sentient creatures outweigh our ethical duty to allow them freedom?
Quote: Original post by WazzatMan
but the main question is: Does the potential danger of giving freedom to sentient creatures outweigh our ethical duty to allow them freedom?


I think there really is no ethical rule that would save us from that dilemma. I do not speak for every human being on the planet, but deep inside my heart I know WE WANT to create such creatures, and I know we are afraid of what would happen then.

To me the real question is: Are we willing to sacrifice our dominant position on this planet in order to do so? I would personally take the risk, yeah. Without blinking.
[size="2"]I like the Walrus best.
Quote: Original post by WazzatMan
...our ethical duty to allow them freedom?

The question is still whether humans have an "ethical duty to allow them freedom." As mentioned before, I've seen nothing in mankind's history that leads me to believe that any such ethical mandate must be upheld. We already enslave other sentient beings at will.
Quote: Original post by WazzatMan
Quote: Media ridicule of that position would be unethical. But since media bends more for advertising money than ethics, and they wouldn't dare offend their sponsors, you're right. Maybe the Unabomber was right too?


I only found out who the Unabomber is when I looked him up on Wikipedia after reading your question. An argument could be made that the Unabomber was right, but that would open up a whole can of worms about whether extreme action is called for in the face of extremely negative consequences. I tend to hold to the Orwellian philosophy that the result of a wave of violence is a world pervaded by violence.

An argument could also be made about whether or not freedom of speech is truly free when the major channels of communication are monopolized by media giants. I couldn't possibly, for example, get my message across to the television crowd without jumping through a lot of hoops, and with no guarantee of getting it across in the end. The same goes for billboards. The only other avenues for me to get my message across would be a book, a magazine, or the internet. But then I wouldn't get the audience I need for my message to gather momentum, and the people who would buy the magazine or book where my message is placed already believe in that message.

And of course, once I get my message across, I'm setting myself up for the giants to come and eat me.

Was the Unabomber right in using such extreme measures? I think not. Did he have any other way of getting his message across? No.


When I asked if the Unabomber was right, I wasn't thinking about the extreme measures he took, I was thinking about the argument(s) he made in his manifesto. I read his manifesto back when it was first posted to the internet, around the time of his arrest, so that's what I associate him with, more than with his terrorist activities.

As for freedom of speech, it seems more the case that these days it is highly contingent. If you have the money to purchase space and if the owners of the various media platforms selling space agree to your message, then you can promulgate your message without fear of being arrested by the government because your message is politically or socially unpopular. Free speech belongs to those who own the various media platforms. What has happened is that free speech has been colonized by commerce (even you finished your remark with a nod towards people buying and selling the hypothetical message as product).
"I thought what I'd do was, I'd pretend I was one of those deaf-mutes." - the Laughing Man

This topic is closed to new replies.
