Here's an old paradox where the rationalist might not have the upper hand. What do you guys think? There's a good chance you've read this one before; if so, share your thoughts.
An alien comes from many stars away and explains to you that he has the power to predict how you'll make decisions. Having seen him correctly predict other humans' decisions, you are convinced that he can do this. You're not absolutely 100% sure, but you are 99.99% sure, as sure as we can be of anything in this life.
With this in mind, the alien sets up the following experiment: he gives you two sealed boxes, called A and B. Inside A is $100, a sure thing. Inside B is either nothing or $1,000,000. The alien says that you are free to take just box B or to take both boxes. The catch is that he has already made his prediction about which choice you'll make. If he predicted that you'll take only B, then he generously stuffed it with the one million (he tells you this). If he predicted that you'll take both boxes, then he put nothing in B. After leaving the boxes, the alien departs in his spaceship, never to return, leaving you to ponder this decision. Do you take both of the boxes, or only box B?
This riddle is terribly interesting to me because it is one of the few scenarios where rigidly rational decision making leads to a less optimal outcome than more "stupid" decision making. But isn't that a contradiction in terms? To me, this paradox illustrates the limits of rationality, or maybe that we just think about rationality in the wrong way.
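To make the numbers concrete, here's a quick Monte Carlo sketch (my own illustration in Python, assuming the alien's prediction is independently right 99.99% of the time on each trial):

```python
import random

ACCURACY = 0.9999   # how often the alien's prediction is right (per the setup)
TRIALS = 100_000

def payout(choice):
    """Payout for one trial; the alien's prediction matches `choice` with probability ACCURACY."""
    predicted_correctly = random.random() < ACCURACY
    if choice == "one-box":
        # Correct prediction -> he filled B with $1,000,000; wrong -> B is empty.
        return 1_000_000 if predicted_correctly else 0
    else:  # "two-box"
        # Correct prediction -> B is empty, you get A's $100; wrong -> B was filled too.
        return 100 if predicted_correctly else 1_000_100

for choice in ("one-box", "two-box"):
    avg = sum(payout(choice) for _ in range(TRIALS)) / TRIALS
    print(f"{choice}: average winnings ${avg:,.2f}")
```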
When does rational thinking fail?
Easy: I take box B and tell my girlfriend to take box A. Problem?
Widelands - laid back, free software strategy
Isn't it rational thinking that tells you to go for just box B because then he'll have put $1,000,000 in it?
Or is that part (under which circumstance he'll have put $1,000,000 in box B) not known to the decision maker? In that case, the problem is insufficient knowledge. It's the same as if I offer you $1,000 for free but don't tell you that I'm going to shoot you if you take it: the rational decision is to accept the free money.
The point is that any rational decision needs to be an informed decision: you have checked the facts and know as much as possible about your chance of success, the resulting advantages and their likelihood, and the downside, the risks and the likelihood of them occurring in case of failure, while staying aware of what you weren't able to find out. In the case of the alien, you might try to find out whether he has a habit of tricking people, and thereby rationally arrive at the conclusion to take box B. In the case of the guy with the $1,000, the line of dead bodies with $1,000 notes in their hands might give a clue.
Professional C++ and .NET developer trying to break into indie game development.
Follow my progress: http://blog.nuclex-games.com/ or Twitter - Topics: Ogre3D, Blender, game architecture tips & code snippets.
Did the alien actually tell me, "If he predicted that you'll only take B then he generously stuffed it with the one million," or is that an assumption I make? Because it seems like a big assumption. If he has already 'predicted' it, it doesn't matter if you take both (which could be a way to make his prediction true); to destroy any prediction you could take neither. However, if the aforementioned assumption wasn't an assumption, you would take B.
Engineering Manager at Deloitte Australia
Who cares about $100 when there's $1,000,000 in the other box? The payoff for taking box B far exceeds that of box A, and if the alien is wrong (unlikely, it seems), well, you're only losing $100 (well, you're not really losing anything, but...). Given that the alien just left the boxes and flew off in his spaceship before you took them, it's all a bit dull anyway.
Take B; if there's nothing in it, then take A too. That's the logical choice (you end up taking both and only get $100, so the alien is correct). What other choice is there, really? Take B, find there's nothing in it, then leave A with $100 just sitting there?
I suppose it depends on whether you care more about the $100 than you do about making the alien wrong.
Interested in Fractals? Check out my App, Fractal Scout, free on the Google Play store.
Rationality is not the same as perfection and does not necessarily lead to optimal solutions at all times. A more interesting (and realistic) scenario is the Prisoner's dilemma, where rational behavior generally leads to suboptimal results.
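For anyone who hasn't seen it, here's a minimal sketch with the classic textbook payoffs (the exact numbers are my choice; any values with the same ordering work):

```python
# Standard Prisoner's dilemma payoff table.
PAYOFFS = {
    # (my move, their move) -> (my payoff, their payoff)
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}

# Defecting strictly dominates: whatever the other player does,
# I score more by defecting...
for theirs in ("cooperate", "defect"):
    coop = PAYOFFS[("cooperate", theirs)][0]
    defect = PAYOFFS[("defect", theirs)][0]
    print(f"if they {theirs}: I get {coop} cooperating vs {defect} defecting")

# ...yet mutual defection (1, 1) leaves both players worse off than
# mutual cooperation (3, 3): individually rational, collectively suboptimal.
```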
I would think the OP's story is rather easy to solve: assuming the alien is speaking the truth, one choice gets you a million, the other a hundred. If you take the probabilities into account, you get EV(B only) = 0.9999 × $1,000,000 + 0.0001 × $0 = $999,900 and EV(A+B) = 0.9999 × $100 + 0.0001 × $1,000,100 = $200, hence a rational actor still picks B. The only irrational part I can see is in believing the alien can predict your future, but you assign a probability to that too, so that makes it moot.
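You can check that arithmetic in a couple of lines (same 99.99% assumption):

```python
p = 0.9999  # probability the alien predicted correctly
ev_b_only = p * 1_000_000 + (1 - p) * 0          # wrong prediction -> B is empty
ev_both   = p * 100       + (1 - p) * 1_000_100  # wrong prediction -> B was filled too
print(f"B only: ${ev_b_only:,.2f}")  # ~$999,900
print(f"A + B:  ${ev_both:,.2f}")    # ~$200
```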
Just don't take either of the two boxes: if he can predict what you'll do, and he already knew you wouldn't take anything, he wouldn't have set up this experiment.
Rational does not imply optimal, nor does it guarantee a winning strategy.
rigidly rational decision making leads to a less optimal outcome than more "stupid" decision making.
Plenty of cases demonstrate that a rational approach to selecting a strategy will frequently be sub-optimal, and that many day-to-day events result in better outcomes through irrational thinking. This is especially true when it comes to society, where crowd behavior (crowdsourcing) can produce a better outcome.
Perhaps a better term would be "counter-intuitive".
Then there's Parrondo's paradox, where choosing losing strategies results in winning. Thinking about rationality by invoking an oracle is only useful in theory; in practice there are too many unknowns involved.
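For the curious, here's a minimal simulation sketch of the classic coin-game version of Parrondo's paradox (the capital-dependent games from the literature, with the usual bias epsilon = 0.005):

```python
import random

EPS = 0.005  # small bias that makes both games individually losing

def game_a(capital):
    """Game A: a coin that wins with probability just under 1/2."""
    return 1 if random.random() < 0.5 - EPS else -1

def game_b(capital):
    """Game B: a bad coin when capital is a multiple of 3, a good coin otherwise."""
    p = (0.10 - EPS) if capital % 3 == 0 else (0.75 - EPS)
    return 1 if random.random() < p else -1

def average_final_capital(strategy, steps=50_000, runs=20):
    """Average capital after playing `strategy` for `steps` rounds, over several runs."""
    total = 0
    for _ in range(runs):
        capital = 0
        for _ in range(steps):
            capital += strategy(capital)
        total += capital
    return total / runs

print("A alone:    ", average_final_capital(game_a))   # drifts negative
print("B alone:    ", average_final_capital(game_b))   # drifts negative
print("random A/B: ", average_final_capital(
    lambda c: random.choice((game_a, game_b))(c)))     # drifts positive
```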