Sci-fi: human factor
While recently thinking about designing a world beyond ours in terms of technology (our near or far future), I've come across a nearly unsolvable problem: the human factor. By human factor I mean the magnitude of human impact in all areas of a future society, mainly in terms of technological advancement. This problem was nicely summarized in Bill Joy's article "Why the Future Doesn't Need Us". Once we determine that technology is advanced enough to support complex computer programs resembling AI (note that this is a prerequisite), the real question is: where would we apply these AI programs to replace humans? The answer may be either everywhere or not everywhere. However, if we apply AI management in some places and not others, one may argue that the designer has made a mistake by not putting AI programs in place B, which is logically similar to place A where AI has been put. Example: consider a huge interstellar ship carrying smaller combat units. If we have AI programs to guide these small combat units, shall we put all strategy in the hands of another AI program, or shall we put a human general or other figure in charge? Note the difference between AI (artificial intelligence) and ACE (artificial consciousness entity). An AI may only be a program advanced enough to learn new information, but limited to the specific problem it was created for. An ACE, on the other hand, is self-aware and can learn just about anything within reach of its sensors. What do you think? Does the future need us?
So... Muira Yoshimoto sliced off his head, walked 8 miles, and defeated a Mongolian horde... by beating them with his head?
Documentation? "We are writing games, we don't have to document anything".
As I've said in another post, the main game-design nemesis of "many AI combat drones" is commonplace EMP systems.
If the AIs can be easily destroyed remotely by anyone with the future equivalent of a handgun, critical systems will not be entrusted to them.
Replacing humans...
Replace humans in what? Remember that we work to live. Our work doesn't give us anything more than the means to live. In the future, machines will take care of every job, leaving us to pursue other things in life. It will basically create a carefree society where people do whatever they want, because our actions will only have repercussions on our social lives. You see, today if you wake up late you will be fired from your job; in the future you won't have a job at all, because a machine will do it. It's not a question of whether the future needs us; it's a question of what we need in the future.
Quote:
Original post by Anonymous Poster
Replacing humans...
Replace humans in what? Remember that we work to live. Our work doesn't give us anything more than the means to live. In the future, machines will take care of every job, leaving us to pursue other things in life. It will basically create a carefree society where people do whatever they want, because our actions will only have repercussions on our social lives. You see, today if you wake up late you will be fired from your job; in the future you won't have a job at all, because a machine will do it. It's not a question of whether the future needs us; it's a question of what we need in the future.
Your image of the future (however realistic) is a very grim one, in my opinion. Imagine a world where every single aspect of work and laborious tasks is handled by machines. You say it would give us time to pursue other things in life, but what other things are there worth pursuing? Life would become incredibly easy, and humans would never have to learn, because the world wouldn't need scientists, engineers, doctors, lawyers, etc., as we'd have incredibly powerful computer systems to deal with everything. Man as a species would then realise that the only activities worth performing are those which give personal entertainment and stimulation. Man would become lazy, self-obsessed, uneducated, selfish, vain and self-indulgent. It sounds like the kind of world I wouldn't like to live in.
The problem comes from the fact that the more computers and computer technology advance, the more we as a species use these systems to do that which we don't want to do, that which we can do but not fast enough, and that which we can't yet do. But when we reach the point where we can do anything, that critical age where humans have achieved everything there is to achieve that could benefit the human race, overcoming all obstacles, all our weaknesses and all difficulties, then what will be left to achieve?
That's why man will devote all the passion and energy once used to carry us forward and achieve greatness to fulfilling our own personal desires and keeping ourselves occupied as the days roll by.
It's not the sort of future that would be beneficial to us as biological entities, because all creatures on the planet were made to work, whether it be finding food or raising young. I can just about see the image of a human with no limbs and just a single finger planted in front of a computer panel fitting this kind of future accurately.
In the future, all of mankind will be employed in middle management, as robots take over manufacturing and master AIs replace corporate CEOs.
Yes, we'll all be Dilbert's boss.
Realistically though, there will always be a use for human workers. That use will be to mediate the movement of wealth in a dynamic economy. Think of George Jetson pushing the red button. I base this on the thinking of Henry Ford: when asked why he paid his workers so much money, his answer was "Because they buy my cars."
william bubel
*ARRRGH!* Pet peeve #67571: Deterministic futures based on linearly extrapolated trends.
-The splitting of the atom led to predictions of nuclear-powered airplanes and nuclear-powered cars.
-The Space Program led bright minds to predict that there would be millions of humans living and working in space by 2004.
-Chemical engineering discoveries in the first half of the 20th century led to predicted futures of food pills, liberal use of DDT and better living through chemistry.
-In the 1970s, corporate leaders who should have been the ones in the know predicted a worldwide market of only a handful of computers.
-Thomas Jefferson foresaw an America with only a few tens of millions of Americans, largely based on an agricultural economy.
My point?
There are so many possibilities here that I think any worldbuilder undermines themselves when they say, "It will be so..." Many of these futures suffer from drastic oversimplification, amplification of a single element or factor, and little consideration of the holistic responses that shape any new technology (market forces, timing, government forces, morality, social conditions, etc.).
It's not a given that AI will replace us. Heck, with biotechnology and nanotech, we may become the machines.
AI development may be banned by nations around the world for ethical reasons, in a similar vein to the shaky bans on human cloning. So AI may be rare and illegal.
The problem of emulating all areas of human knowledge may be insurmountable, or it may lead down roads so similar to human neural biology as to be worthless or cost-ineffective (is the mind a quantum computer or a digital computer?).
Future societies with more idle citizens may find political stability dicey; alternatively, welfare states may arise which support stronger citizens' rights movements, which in turn exploit or control the development of AI through votes or mass actions, similar to the worldwide labor movements of the 1900s.
And just how much does it cost to support an AI? What hardware / wetware does it run on? How long does it take to educate? How much of the world is democratic and what do people think about thinking machines? Do these machines have desires of their own, are they self willed, do they suffer such artifacts of consciousness as boredom, anger, suicidal thoughts, curiosity, anxiety or a need for justice or revenge?
In terms of game design, I think that because the future is so flexible, you first decide where you want the AI to fit in, then provide the explanation for why it is so.
/end rant (sorry, I don't do that often, but this one really bugs me)
--------------------Just waiting for the mothership...
I agree. The future is so unstable that it is impossible to determine, with any accuracy, what will happen.
Taking the stated scenario into account, there will still always be a job for humans, unless the computer can emulate creativity and the ability to foresee new applications for existing AI and hardware.
As a side note, I would love to see biotechnology advances increase exponentially. But if I ever got a cyberbrain (a la Ghost in the Shell), it would definitely run on a Unix equivalent.
==^_^==
There are so many unique ways in which the future could evolve, and it is always nice when game makers develop new IP; I would love to see some new ideas for the future in games.
Jesse Crafts-Finch
Polar Bear Development Studios
Project Lead