Newcomb's Paradox

Newcomb's Paradox is a well-known problem, and I won't try to go through all the angles, interpretations and arguments.  The basic problem is this (taken from Wolfram):

Given two boxes, B1 which contains $1000 and B2 which contains either nothing or a million dollars, you may pick either B2 or both. However, at some time before the choice is made, an omniscient Being has predicted what your decision will be and filled B2 with a million dollars if he expects you to take it, or with nothing if he expects you to take both.

It's common to suppose that the Predictor is not necessarily omniscient.  It can just be an extremely reliable supercomputer, say.  Grey's Labyrinth gives a nice introduction to the problem and a very clever go at a solution, too.  The claim is that it is most rational to choose just one box.  I agree.  Here's why.

First off, I don't think the paradox should be taken as an argument against free will, or against the compatibility of free will and determinism.  It might force us to think about free will and determinism in unusual ways, but that is all.

People who claim you should take both boxes emphasize that, no matter what the Predictor predicts, the money will already be in place, so there would be no reason not to choose both boxes.  The money's already there!  So they choose both boxes . . . and end up with $1,000.  Because, obviously, the Predictor will have known that they were going to think that way.
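
The expected-value arithmetic behind this is easy to make explicit.  Below is a minimal sketch in Python; the predictor accuracy p is my own illustrative assumption, since the puzzle itself treats the Predictor as all but infallible.

```python
# Expected payoff of one-boxing vs. two-boxing, assuming the Predictor
# is right with probability p.  The parameter p is an illustrative
# assumption; the puzzle treats the Predictor as (near) infallible.

def expected_payoffs(p):
    # One box (B2 only): you get the $1,000,000 only if the Predictor
    # correctly foresaw that you would take just B2.
    one_box = p * 1_000_000

    # Both boxes: you always get the $1,000 in B1, plus the $1,000,000
    # only if the Predictor wrongly expected you to take just B2.
    two_box = 1_000 + (1 - p) * 1_000_000

    return one_box, two_box

for p in (0.5, 0.9, 0.99, 0.999):
    one, two = expected_payoffs(p)
    print(f"p = {p}:  one-box ${one:,.0f}   two-box ${two:,.0f}")
```

On this way of counting, one-boxing comes out ahead as soon as the Predictor is right more than about 50.05% of the time, never mind "extremely reliable."  The two-boxer's "the money's already there" reasoning only pays off in the cases where the Predictor got you wrong.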

But what's the alternative?  You'd have to be stupid to just choose one box, right?

Not if you take the Predictor seriously.

What I suspect is that people who believe you should take both boxes don't take the Predictor seriously enough.  They believe that the Predictor can't really know how they will act:  Either their action is in some way independent of its causal history, or the Predictor can't take enough variables into consideration to do what it is stipulated to do.

Our actions can be extremely sensitive to external conditions, so it seems practically impossible for the Predictor to know ahead of time what anyone will do.  We cannot imagine the level of knowledge the Predictor would need to predict our behavior accurately, and this is perhaps why most people just don't take the Predictor seriously enough, and why taking both boxes seems like the obviously rational option.  It's hard to take the Predictor seriously, not because we intuitively believe we have contra-causal free will, but because our actions are so utterly unpredictable.

People who take the Predictor seriously will accept that the Predictor knows (or comes close enough to knowing) what they are going to do.  So you should try your hardest to just take that one box.

It might be too hard.  You might think, "But the money's already there!!"  And that's true, and so you can take both boxes . . . and end up with $1,000.  Because to take both boxes and end up with $1,001,000 is, if not impossible, as close to impossible as you can get.
