
What do you think about the Revelation?

Started by July 11, 2011 11:13 AM
471 comments, last by _the_phantom_ 13 years, 1 month ago

[quote]
[quote name='mikeman' timestamp='1311620158' post='4840126']
What are you talking about? You indeed stated 'my vacuum cleaner's hunger is not different than mine' without proving it. You continuously ask people to prove their opinions while you get to make broad statements without even a hint of objective evidence.[/quote]


My assertion is that my sensing the need for something and then reacting to it by seeking it out is no different from my vacuum cleaner doing the same. How exactly do you recommend I prove that? Can I list all aspects and details of human hunger and exhaustively compare each one to my vacuum cleaner's hunger? No, I cannot. To prove that they ARE different, however, is very easy: one need only list one feature that makes the two qualitatively different.

My assertion does not require any extraordinary evidence because it's not an extraordinary claim. My assertion does not necessitate the existence of a supernatural "soul". My assertion is simply what we all observe: I receive a "hunger" signal and then seek food; a lion does the same thing; a slug does the same thing; and my vacuum does the same thing when it needs power.

It is the claim that they are different that requires some sort of evidence, because it is an assertion that something exists that makes them different. If you make that claim, you're going to have to tell us what it is that makes them different.

You guys are really stretching the boundaries of common sense.
[/quote]

Is it my turn to LMAO?

Yeah, we're the ones stretching common logic...

No buddy, that a human's hunger and a vacuum cleaner's hunger are essentially the same thing is not as apparent as 1+1=2. For starters, when a person or animal is 'hungry', it has various effects on, and interconnections with, its psychology and behaviour, and the effects may vary significantly from person to person, in unpredictable ways. The vacuum cleaner, at most, will invariably display an 'out of power' signal. The human or animal doesn't just yell 'I am out of food'. It will experience and demonstrate various other feelings and behaviours. It will feel sad. It will feel weak. It will recall days when there was plenty of food, and remember them with nostalgia. It will try to counter the feeling of hunger by mentally focusing on other things. It might feel fear of death in case it doesn't find any food for a long period. It will feel jealousy and anger toward those that have enough food. It might feel guilt for feeling jealousy. It will try to shake off that guilt by rationalizing it. It will seek food on its own.

Here, did I list some differences? If you make the claim that they are not qualitatively different, you're going to have to prove it. You are not special in that regard. And if you can't prove it, you still aren't justified in stating that the absence of proof of the opposite is proof of your claim.

[quote]For starters, when a person or animal is 'hungry', it has various effects on its psychology and behaviour, and the effects may vary significantly from person to person, in unpredictable ways. The vacuum cleaner, at most, will invariably display an 'out of power' signal.[/quote]


Your argument for the two being qualitatively different is that humans act differently while vacuum cleaners at most display an "out of power" signal? I don't understand; please try to be more precise.

If the point that you're getting at is that human hunger is different from vacuum cleaner hunger because humans react to hunger differently from each other while all vacuum cleaners (of a given model) react exactly the same, then you're wrong. Just like humans, smart vacuum cleaners (and really any robot) react differently according to a variety of inputs. A human's physiology and input array are much more complicated than those of any current vacuum cleaner, so it's much harder to predict how any human will react than it is to predict how any robot will react, but that is a quantitative difference rather than a qualitative one.

If the point you're getting at is that humans have variable processors (brains) with the ability to learn, making each one unique, then robots commonly have the same thing. Give two vacuum cleaners neural networks that let them learn a floor plan and they will learn it in different ways according to differences in input. They will then react differently when they feel hunger, just as humans do. This is very basic AI that any CS undergrad who takes an AI course has first-hand experience programming.
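To make that concrete, here's a toy sketch in C (every name here is invented for illustration; no real robot's firmware looks quite like this). Two vacuums running the exact same learning rule end up with different "memories" of the floor purely because their sensor histories differ:

#define GRID_W 10
#define GRID_H 10
#define LEARNING_RATE 0.1f

/* Toy sketch, not real firmware: this unit's learned estimate of how
   dirty each floor cell tends to be. */
float learnedDirt[GRID_W][GRID_H];

/* The same rule runs in every unit: nudge the stored estimate toward what
   was just sensed. Different visit orders and different sensor readings
   produce different maps, and therefore different behaviour later on. */
void ObserveCell(int x, int y, float dirtSensed)
{
    learnedDirt[x][y] += LEARNING_RATE * (dirtSensed - learnedDirt[x][y]);
}

Drop two identical units into two different houses and their maps diverge immediately; even two units in the same house diverge if they start in different corners. The hardware and code are identical, only the inputs differ.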

So, be more precise. What exactly is the difference you're trying to point out, and how exactly does it show a qualitative difference between my vacuum's hunger and mine?

[quote]
[quote name='mikeman' timestamp='1311623754' post='4840158']
For starters, when a person or animal is 'hungry', it has various effects on its psychology and behaviour, and the effects may vary significantly from person to person, in unpredictable ways. The vacuum cleaner, at most, will invariably display an 'out of power' signal.[/quote]

Your argument for the two being qualitatively different is that humans act differently while vacuum cleaners at most display an "out of power" signal? I don't understand; please try to be more precise.

If the point that you're getting at is that human hunger is different from vacuum cleaner hunger because humans react to hunger differently from each other while all vacuum cleaners (of a given model) react exactly the same, then you're wrong. Just like humans, smart vacuum cleaners (and really any robot) react differently according to a variety of inputs. A human's physiology and input array are much more complicated than those of any current vacuum cleaner, so it's much harder to predict how any human will react than it is to predict how any robot will react, but that is a quantitative difference rather than a qualitative one.

If the point you're getting at is that humans have variable processors (brains) with the ability to learn, making each one unique, then robots commonly have the same thing. Give two vacuum cleaners neural networks that let them learn a floor plan and they will learn it in different ways according to differences in input. They will then react differently when they feel hunger, just as humans do. This is very basic AI that any CS undergrad who takes an AI course has first-hand experience programming.

So, be more precise. What exactly is the difference you're trying to point out, and how exactly does it show a qualitative difference between my vacuum's hunger and mine?
[/quote]


So your argument is, 'when the vacuum cleaner is out of power, it does things' and 'when the human is out of food, it does things', therefore 'the two are the same'? Really? I showed that the 'things' are qualitatively different. The vacuum cleaner displays a message so the owner will know and connect it to a power source. That's the behaviour. A quantitative difference would be a difference in the message. A qualitative difference would be a different set of behaviours altogether. Self-reflecting on your condition and demonstrating sadness, anger, jealousy, agony, fear, embarrassment, and the effort to forget about it by tricking your own brain and shifting your focus to other things are qualitatively different from 'display message'.
[quote]It will feel sad[/quote]
This is a signal, not a response. My brain says "I am sad that I am hungry" which means "avoid becoming hungry." My vacuum cleaner has this built in.

[quote]It will feel weak.[/quote]
My vacuum cleaner will eventually run out of juice if it doesn't find its base. Before it does that it runs down its battery, and becomes weak. It no longer moves as fast, until it stops completely.
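In code form (a toy sketch; batteryLevel and maxSpeed are names I just made up), that "weakness" is nothing more than speed scaling with remaining charge:

/* toy sketch: the lower the battery, the "weaker" (slower) the vacuum moves */
float CurrentSpeed(float batteryLevel /* 0.0 to 1.0 */, float maxSpeed)
{
    if (batteryLevel <= 0.0f)
        return 0.0f;                 /* out of juice: stops completely  */
    return maxSpeed * batteryLevel;  /* running down: moves ever slower */
}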
[quote]It will recall days when there was plenty of food, and remember them with nostalgia.[/quote]
My vacuum cleaner doesn't happen to include software to store information about previous times it charged at its base station, but there's no reason it couldn't.
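Something like this would be enough (a hypothetical sketch; my actual vacuum ships with nothing of the sort):

#define MAX_CHARGES 100

/* one remembered visit to the base station (hypothetical) */
struct ChargeMemory { long timestamp; float chargeGained; };

struct ChargeMemory chargeLog[MAX_CHARGES];
int chargeCount = 0;

/* record each charge so "days when there was plenty of food" can be recalled */
void RememberCharge(long timestamp, float chargeGained)
{
    if (chargeCount < MAX_CHARGES)
    {
        chargeLog[chargeCount].timestamp    = timestamp;
        chargeLog[chargeCount].chargeGained = chargeGained;
        chargeCount++;
    }
}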
[quote]It will try to counter the feeling of hunger by mentally focusing on other things.[/quote]
This is simple enough to program.

if (cant_find_food)
{
    // counter feeling of hunger by focusing on other things
    currentPriorityAction = PickRandomActionOtherThanFindFood();
}
[quote]It might feel fear of death in case it doesn't find any food for a long period.[/quote]
I don't know if my vacuum cleaner knows that it's getting close to running out of juice, but obviously it easily could. And it could start flashing its red light harder when it did. Its speed could increase, to mimic our fight-or-flight response, if you like.
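That escalation is a few lines of code (again just a sketch, with flashRate and speed invented for the example):

/* sketch: the closer it is to "death", the harder it flashes and the faster it hunts */
void UpdateUrgency(float batteryLevel, float *flashRate, float *speed)
{
    float fear = 1.0f - batteryLevel;  /* "fear" rises as the battery drains   */
    *flashRate = 1.0f + 9.0f * fear;   /* red light flashes harder and harder  */
    *speed     = 1.0f + fear;          /* mimics a fight-or-flight burst       */
}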
[quote]It will feel jealousy and anger for those that have enough food.[/quote]
First of all, you're describing uniquely human traits here. No non-human animals (or at least few non-human animals) get jealous. But you could program that in if the vacuum cleaner had the capacity to sense that other vacuum cleaners have found their charging station.

if (do_sense_that_other_vacuums_are_charging)
{
    desireToCharge++;
    angerTowardOtherVacuum++;
}
[quote]It might feel guilt for feeling jealousy. It will try to shake off that guilt by rationalizing it.[/quote]
Making the emotions more and more complex simply makes them harder and harder to program in. Again, a quantitative difference, not a qualitative difference.
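Even a second-order emotion like that is just more state stacked on the first (a sketch with invented names, not anything a shipping vacuum has):

/* sketch: guilt is fed by jealousy; "rationalizing" slowly bleeds it away */
void UpdateGuilt(float jealousy, float *guilt)
{
    *guilt += 0.5f * jealousy;  /* feeling jealous produces guilt            */
    *guilt *= 0.9f;             /* rationalization discharges it over time   */
}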
[quote]It will seek food on its own.[/quote]
Uh, what? Did you not read much of the conversation? Direct quote from me:
"My vacuum cleaner seeks its home when it's running out of power."

[quote]So your argument is, 'when the vacuum cleaner is out of power, it does things' and 'when the human is out of food, it does things', therefore 'the two are the same'? Really? I showed that the 'things' are qualitatively different. The vacuum cleaner displays a message so the owner will know and connect it to a power source. That's the behaviour. A quantitative difference would be a difference in the message. A qualitative difference would be a different set of behaviours altogether. Self-reflecting on your condition and demonstrating sadness, anger, jealousy, agony, fear, embarrassment, and the effort to forget about it by tricking your own brain and shifting your focus to other things are qualitatively different from 'display message'.[/quote]



Why is it you people find it so hard to read? My vacuum cleaner seeks power just as you seek food. It literally looks for power. It wanders around my house looking for it until it finds its 'food', so that it can alleviate its negative feeling of "need food".

It really is a chore to try to argue in a civil way with you people. My feelings discussed earlier in this thread are verified again and again. Could you guys be so kind as to tell me how to ignore each of you? I promise you'll never hear from me again :)

[color="#1c2837"]
[color="#1c2837"]It will feel sad[color="#1c2837"]

[color="#1c2837"]This is a signal, not a response. My brain says "I am sad that I am hungry" which means "avoid becoming hungry." My vacuum cleaner has this built in.

[color="#1c2837"]

[color="#1c2837"]
It will feel weak.[/quote]
[color="#1c2837"]My vacuum cleaner will eventually run out of juice if it doesn't find its base. Before it does that it runs down its battery, and becomes weak. It no longer moves as fast, until it stops completely.
[color="#1c2837"]

[color="#1c2837"]
It will recall days when there was plenty of food, and remember them with nostalgia.[/quote]
[color="#1c2837"]My vacuum cleaner doesn't happen to include software to store information about previous times it charged at its base station, but there's no reason it couldn't.
[color="#1c2837"]

[color="#1c2837"]
It will try to counter the feeling of hunger by mentally focusing on other things.[/quote]
[color="#1c2837"]This is simple enough to program.
[color="#1c2837"]

[color=#1C2837][size=2]if (cant_find_food)
[color=#1C2837][size=2]{
[color=#1C2837][size=2] // counter feeling of hunger by focusing on other things
[color=#1C2837][size=2] currentPriorityAction = PickRandomActionOtherThanFindFood();
[color=#1C2837][size=2]}
[color=#1C2837][size=2]

[color="#1c2837"]

[color="#1c2837"]
It might feel fear of death in case it doesn't find any food for a long period.[/quote]
[color="#1c2837"]I don't know if my vacuum cleaner knows that it's getting close to running out of juice, but obviously it easily could. And it could start flashing its red light harder when it did. Its speed could increase, to mimic our flight-or-flight response, if you like.
[color="#1c2837"]

[color="#1c2837"]
It will feel jealousy and anger for those that have enough food.[/quote]
[color="#1c2837"]First of all, you're describing uniquely human traits here. No non-human animals (or at least few non-human animals) get jealous. But you could program that in if the vacuum cleaner had the capacity to sense that other vacuum cleaners have found their charging station.
[color="#1c2837"]
[color=#1C2837][size=2]if (do_sense_that_other_vacuums_are_charging)
[color=#1C2837][size=2]{
[color=#1C2837][size=2] desireToCharge++;
[color=#1C2837][size=2] angerTowardOtherVacuum++;
[color=#1C2837][size=2]}
[color=#1C2837][size=2]

[color="#1c2837"]

[color="#1c2837"]
It might feel guilt for feeling jealousy. It will try to sake off that guilt by rationalizing it.[/quote]
[color="#1c2837"]Making the emotions more and more complex simply makes them harder and harder to program in. Again, a quantitative difference, not a qualitative difference.
[color="#1c2837"]

[color="#1c2837"]
It will seek food on its own.[/quote]
[color="#1c2837"]Uh, what? Did you not read much of the conversation? Direct quote from me:
[color="#1c2837"]"[color="#1c2837"]My vacuum cleaner seeks its home when it's running out of power."
[/quote]

I'm sorry, are you describing the vacuum cleaner we all know, or a fictional robot? Your vacuum cleaner returns to a pre-programmed position and displays a message. I described a whole other set of behaviours that your vacuum cleaner does not demonstrate. Which part of those didn't you read? What exactly did you mean by 'qualitative difference'? Your 'programming' examples are completely laughable, equating anger and jealousy with the increment of an integer variable. You left out the difficult part, buddy: actually implementing those feelings and the ability to self-reflect on them, and to write, say, an essay pondering your feelings and your thoughts about those feelings and thoughts.

[quote]Why is it you people find it so hard to read? My vacuum cleaner seeks power just as you seek food. It literally looks for power. It wanders around my house looking for it until it finds its 'food', so that it can alleviate its negative feeling of "need food".[/quote]


I very often avoid food when I am hungry. I also very often eat food when I am not hungry, or am completely full and in danger of vomiting from over-consumption. Sometimes I will eat anticipating that I will be hungry despite not being hungry yet. Other times I will eat because others are eating; though this does not always apply, and the choice is not random. Sometimes I will be extremely hungry and wait to eat; this has happened with pretty much exactly the same inputs: a well-stocked fridge, me doing nothing but sitting on the couch watching TV, similar states of energy and hunger. Does your vacuum display such an array of responses to its input of hunger?

It's more than just random jitter that decides the outcome, and it's that difference in outputs despite similar inputs that I find most curious. Would a computer even be sitting on the couch doing nothing productive in the first place? If it were, would it be relishing the comfort, or would it simply be hibernating until some predetermined timer elapsed?

These are questions we don't know the answers to, and we won't know until we make machines that are more capable. That is why I started by saying:
[quote]On sentient AI. I would be really curious to know if a sentient AI would find things to think about for fun rather than just expanding its knowledge base. I feel like that would be a major departure between sentient AI and humans. Where humans might delight in being entertained or relaxing rather than constantly being productive, I feel like a sentient AI would focus on being productive, and rather than entertaining itself or relaxing it would just shut itself off till it could be productive again.[/quote]
rather than:
[quote]machines are stupid trolololololol[/quote]

[quote]So, be more precise. What exactly is the difference you're trying to point out, and how exactly does it show a qualitative difference between my vacuum's hunger and mine?[/quote]

Prolonged 'hunger' of a vacuum cleaner, let's say two years without being 'fed' electricity, will just cause it not to vacuum. A human, or many other mammals, would actually die under the same circumstances. I find it interesting that you're giving a vacuum human qualities but a few posts ago said that humans are not special. It's, honestly, interesting. I will continue to observe this conversation.

Since you didn't offer a rebuttal to my post, I'll take it that you don't disagree.

[quote]
[quote]1) is wrong, as you've already assumed that the universe was created[/quote]

If I'm not mistaken, modern science states that the universe was created via the Big Bang about 14 billion years ago.
[/quote]

No, it does not say that. It says the universe was just in a different state. The word 'created' implies a cause, and we don't know what happened near the beginning of the Big Bang.
Also, there is no 'physical universe', just the universe.
And you still haven't answered the previous question.

[quote]
[quote name='Telgin' timestamp='1311340536' post='4838884']
Think of it this way: if you rewound your life to yesterday (having never experienced it at all), then wouldn't you go through it exactly the same way the second time? If not, why not?

The things you have learned, coupled with what you experienced that day, would cause you to behave identically.[/quote]

You don't know this to be true and there's no way to know it. That's poor logic.
[/quote]

You're right, of course. Bad example. The point I was driving at is that I believe (and most others probably agree) that if you set up a system in exactly the same way twice, the outcomes will be the same. cowsarenotevil probably described all of this better than I could and the thread has diverged from that topic sufficiently that it's probably best to leave this where it is.
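To illustrate what I mean by "same setup, same outcome", here's a toy C demonstration (just a seeded random number generator, not a claim about brains): set the system up identically twice and you get identical outputs both times.

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* two "runs" of the same system, set up identically (same seed) */
    for (int run = 0; run < 2; run++)
    {
        srand(42);                      /* identical initial state        */
        printf("run %d:", run);
        for (int i = 0; i < 5; i++)
            printf(" %d", rand() % 10); /* same five "decisions" each run */
        printf("\n");
    }
    return 0;
}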

I skimmed the thread up to this point and haven't seen anyone address the other part of my post though. I still don't see how one can reconcile an omniscient god with punishment for bad behavior. Free will doesn't even have to be part of the equation.

The discussion on the vacuum is a bit interesting. Wish I had something to add to it.

