Um... no, it's not a small problem. We are already dependent on the sun, and there's a limited supply of energy. Getting into the cyber world makes you even more vulnerable to the energy supply: now you are depending on a constant supply of electric current. Somebody trips over the wire, some dumbass thinks you haven't paid the bill, a janitor pushes the off button, an earthquake hits; anything could happen that could erase you permanently.
Yeah, sure, you can be invincible there, but to the outside world (here), you can be killed with the flip of a switch.
Whilst there are valid concerns about depending on hardware you have no control over, I disagree that it's that bad. Why would loss of power be a problem? Like any data, I presume there would be stored (autosaved) data that doesn't require constant power, as well as a good backup system. (You might lose temporarily cached data, but you wouldn't be dead.)
I would hope that any system like this has better reliability and backup than what I use for my mp3s ;)
Now here is an interesting question to go along with this: what are your rights? Both "Human" rights and "Personal" rights, such as the right to vote in your country's elections and so on.
If your mind is copied to a computer, does "Computer You" have any legal rights? Should it?
If the computer is paused, the data copied to another computer, and both systems turned back on and allowed to 'diverge' from each other as both 'persons' experience different lives from that point forward, do they count as one, or two people? Can they both vote for the same party in an election?
If you format the storage systems and wipe someone's mind out, does it count as murder?
If you take the view that a mind in a box is not a person under the law, where do you draw the line? Do people who lose a limb or an eye and get a synthetic replacement lose their rights as a human? How about a little more of their body? If you carefully carve away, bit by bit, replacing one biological system after another with hardware, at what point do they no longer count as a human or a person, with all the rights that go with it?
And of course the big one: If a 'mind' is created without copying it directly from another human, but is 100% identical in function and operation to a random sample of human copied minds, should it be granted the same rights? Something 100% artificial vs an artificial copy of a natural brain.
Old Username: Talroth
If your signature on a web forum takes up more space than your average post, then you are doing things wrong.
Well, it seems to me running multiple copies of one's mind would have to be illegal, since one could pretty much force their ideals and beliefs through any political system even without the right to vote. Developing synthetic minds may be illegal for much the same reasons. However, these questions are mostly unanswerable, because we do not know how this technology will emerge or in what type of society.
We all know this is how the Cylon War starts, right?
Pff, hardly. Cylons were artificial.
This is far more like the Titans in Frank Herbert's Dune before the rise of the Thinking Machines. Totally different.
Not entirely! In the Battlestar prequel series Caprica, they (or at least some of them) were artificial reconstructions based on real-life people. That's totally what they're talking about here!
In Caprica it was an AI assuming the role of an existing entity based on a vast collection of data on the subject, not the entity itself copied over to the computer. Still artificial, and many would argue this is a huge difference.
It is amusing how quickly we humans resort to works of fiction to visualize what impact something may have, when the fictional predictions rarely line up with reality. I don't think either series accurately predicts the outcome of such a technology, nor do I believe any of our personal predictions will be the true state of things. I think it is very much tied to the social climate at the time of its availability.
Living exclusively inside a virtual environment is a bit problematic. You are effectively relinquishing control of your existence to the organisation that maintains the computer system. If the machine is switched off, you will no longer exist. Naturally, you will never know about the power-down as it happens, but the constant thought that it could happen is a bit too much. So, no, I would not want to live in a virtual environment.
However, what I'd like to experience is an augmentation of my mind with an artificial system that expands the capability of my consciousness. Increasing my learning and memory capacity would be awesome. And having the ability to remotely control things is also convenient. Best of both worlds!
Perhaps the only situation where a pure virtual environment would be attractive is when my body fails. Even then, if technology is capable of transferring your consciousness into another medium, then surely medicine would be advanced enough to repair your body.
Regarding procedural universes, it's an attractive idea. If you really think about it, our real world is governed by a collection of "simple" rules, constants and relationships which creates a complex, self-organising universe. However, if you were to "fix" bugs in a virtual-world equivalent, meaning changing one of the fundamental rules, you could effectively change everything, for better or worse. Aka the butterfly effect. It's a tricky business.
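To make that butterfly-effect point concrete, here's a minimal toy sketch (the logistic map is purely my stand-in for a "one-rule universe", not anything from this thread): nudge a single constant of the rule by one part in a million and the simulated history eventually becomes unrecognisably different, even from the exact same starting state.

```python
# Toy illustration: a "universe" governed by one simple rule,
# x -> r * x * (1 - x), where r plays the role of a fundamental constant.
def simulate(r, x0=0.5, steps=50):
    """Iterate the rule from x0 and return the full history of states."""
    x = x0
    history = []
    for _ in range(steps):
        x = r * x * (1 - x)
        history.append(x)
    return history

# Same initial state; the rule constant differs by one part in a million
# (a tiny "bug fix" to the universe's physics).
a = simulate(3.9)
b = simulate(3.900001)

# Early on the two worlds agree almost exactly...
print(abs(a[0] - b[0]))
# ...but in this chaotic regime the tiny difference is amplified every
# step, and later states of the two worlds bear no resemblance.
print(max(abs(x - y) for x, y in zip(a, b)))
```

The point is only that the divergence comes from changing the *rule*, not the starting data, which is exactly why "patching" a live procedural universe would be so risky.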