8 hours ago, deltaKshatriya said:
I'd argue that eventually capitalism in its current form will not be able to sustain automation. There will be a point when large swathes of people will find themselves out of a job. Take for example truck driving in the US. It is one of the largest professions in the US. It won't be long before it's automated. What then? What will truck drivers do?
Well....
The question here comes down to: will society bow to technological advance? I don't think so.
What will truck drivers do? They will rebel. They will start attacking autonomous trucks. They will try to sabotage them. Maybe some will try to compete and work themselves to death. Some of the political caste will side with them and try to regulate or downright ban autonomous trucks. They might get crafty and attack those drone trucks based on the dangers of autonomous driving (where currently, they would actually be justified).
I don't expect human workforce to go down without a fight.
And even if truck drivers accept their fate and stop working as truck drivers... do you really think this will automagically lead to society moving in the "right" direction (from your point of view)? No, I can tell you the populists on both sides will try to use it to instantiate an autocratic state of their design. The right will stick to capitalism far longer than it can sustain itself because of "muh freedom"... the left will **** up their ideas, probably going in the direction you envision, because they always do.
Automation will only lead to social unrest, a widening gap between the rich and the poor, and, in the worst case, revolution in the short or mid term. I am sure society can find a way to integrate automation into an evolution of its current economy... but I don't think a radical revolution like algorithm-driven communism will be that evolution, and I don't think we can make it work before automation bites us in the ***.
1 hour ago, mikeman said:
So, to reiterate: even though mass automation and AI may well mean collective ownership of the means of production could be the most rational solution for covering the needs of the many in the 21st century, we should abandon it because communist revolutionaries used too much force in order to achieve this "plan" in backwards, agricultural, semi-feudal countries of the 20th century.
Does that make sense? I mean, we could skip the red flag and the hammer&sickle if that's what's bothering you, Kavik. ![:P :P](https://uploads.gamedev.net/emoticons/medium.tongue.webp)
Well, I think far more than 50% of the population currently alive worldwide would say that justifying communism because "it has never been tried" or because "automation will solve all problems" is a rather optimistic take on a very complicated matter.
Communism has a bad name because it failed every time it was tried.
Was it ever tried in a region or country where the starting conditions were anything other than piss poor? No, of course not. But will a revolution like this ever happen in a country where the starting conditions aren't piss poor? No again. If the system is running, no politician worth his salary will ever change it.
It's all fine and dandy crafting ingenious theoretical economies that are far superior to the subpar one we are currently soldiering on with. But let's not forget Moltke: no plan (or theory, in this case) survives contact with the enemy (or real life, in this case).
Unless you are very, VERY careful how you test and roll out such a system in small chunks, are ready to roll back or only implement it partially, and take a loooong time to do all of this, you will never be able to ease a bigger population into such a communist system.
To add to what I said before: communism has a bad name not only because it failed every time it was tried, but also because it always led to fascism and tyranny. Replacing the human tyrant with an algorithm will not make this any more acceptable to most human beings.
You can at least say your leader is a **** if it's a human (well, at least you can think that without getting executed... until we get mind-reading technology). He can be held responsible for his own deeds.
An algorithm cannot be held responsible. You end up with the worst form of tyranny, because you have handed responsibility over to an irresponsible device. You have essentially created "a god": a being that decides the fate of humans, devoid of emotions, devoid of responsibility, and, if the system works as designed (so that humans cannot simply control the algorithm), with no way of controlling it save revolution and the destruction of that mechanoid god.
A pretty serious dystopia, if you ask me.