this post was submitted on 22 Jun 2023
20 points (91.7% liked)

Deleted (lemmy.dbzer0.com)
submitted 2 years ago* (last edited 1 year ago) by IsThisLemmyOpen@lemmy.dbzer0.com to c/asklemmy@lemmy.ml
 

Deleted

top 46 comments
[–] lvxferre@lemmy.ml 16 points 2 years ago* (last edited 2 years ago) (2 children)

I'll abstract the problem a tiny bit:

  • a = the prize in box A
  • ka = the potential prize in box B; i.e. "k times larger than a"
  • p = the odds of a false positive. That is, the odds that you pick box B only and it got nothing, because the dumb machine assumed that you'd pick A too.
  • n = the odds of a false negative. That is, the odds that you pick A+B and you get the prize in B, because the machine thought that you wouldn't pick A.

So the output table for all your choices would be:

  1. pick nothing: 0
  2. pick A: a
  3. pick B: (1-p)ka
  4. pick A+B: a + nka

Alternative 4 supersedes 1 and 2, so the only real choice is between 3 (pick B) or 4 (pick A+B).

You should pick A+B if a + nka > (1-p)ka. This is a bit messy, so let's say that the odds of a false positive are the same as the odds of a false negative; that is, n=p. So we can simplify the inequality into

  • a + nka > (1-n)ka // subbing "p" with "n"
  • 1 + nk > (1-n)k // divided everything by a
  • 1 + nk - (1-n)k > 0 // changed sides of a term
  • 1 + 2nk -k > 0 // some cleaning
  • n > (k-1)/(2k) // isolating n

In OP's example, k=1000, so n > (1000-1)/(2*1000) β†’ n > 999/2000 β†’ n > 49.95%.

So you should always pick B. And additionally, pick A if the odds that the machine is wrong are higher than 49.95%; otherwise just B.

Note that 49.95% is really close to 50% (a coin toss). Since we're presumably dealing with a machine that can actually predict the future somewhat reliably, n should be way lower than that, so you're probably better off picking B and ignoring A.
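A quick numeric sanity check of that threshold (a minimal Python sketch, assuming n = p and using OP's numbers; the function name and the sample error rates are just placeholders):

def expected_payoffs(a, k, n):
    # Expected value of each choice, assuming the false-positive rate equals
    # the false-negative rate (p = n).
    return {
        "nothing": 0,
        "A only": a,
        "B only": (1 - n) * k * a,   # lose the big prize if the machine wrongly expected A+B
        "A+B": a + n * k * a,        # keep the big prize only if the machine wrongly expected B-only
    }

a, k = 1_000_000, 1000               # $1M in A, potentially $1B in B
print((k - 1) / (2 * k))             # threshold: 0.4995
print(expected_payoffs(a, k, n=0.10))  # reliable machine: "B only" wins
print(expected_payoffs(a, k, n=0.60))  # unreliable machine: "A+B" wins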

[–] IsThisLemmyOpen@lemmy.dbzer0.com 4 points 2 years ago (1 children)

I'll abstract the problem...

Proceeds to teach calculus

[–] lvxferre@lemmy.ml 2 points 2 years ago

BRB, finding a way to insert integration by parts into that. :^)

[–] joobeejoo47@kbin.social 2 points 2 years ago
[–] autumn@reddthat.com 5 points 2 years ago* (last edited 2 years ago) (1 children)

Does the machine know that taking no box is not an option? Thinking that A, A+B, and B are all valid options is something I could see an AI doing.

There also isn't much penalty for taking A+B? You'll always get at least $1M. And if you took only B, the max you could get is $1M.

Edit: I can't read lol. I'd still take both, the result is

Predicted A+B: $1M

Predicted B only: $1001M

If you only take box B, the result is

Predicted A+B: $0

Predicted B only: $1000M

[–] IsThisLemmyOpen@lemmy.dbzer0.com 3 points 2 years ago* (last edited 2 years ago)

I think the machine predicts 2 results, either

Box A is taken = True

OR

Box A is taken = False

Something like:

if box_a_taken:                        # the machine's prediction: the player will take Box A
    box_b_contents = 0                 # punish taking A by leaving Box B empty
else:
    box_b_contents = 1_000_000_000     # reward leaving A by filling Box B

Machine doesn't care if you also take Box B, it only cares if Box A is one of the boxes taken. If you take no boxes, Box B would still have a billion dollars, although that's kind of a dumb choice from a gameshow host's perspective.

[–] Valmond@lemmy.ml 5 points 2 years ago* (last edited 2 years ago)

Box A.

You never know the shenanigans of a machine, and one million is more than enough for me until I die. Or, if science gives us the option to live forever, I bet machines will do the work for us :-)

Edit: as I believe the machine can be wrong, I'd probably take A + B

[–] dan1101@lemmy.world 5 points 2 years ago (1 children)

I'd take box A and B because that would get me 1 MILLION DOLLARS. Yes I'm risking 1 BILLION DOLLARS but I'd rather have a guaranteed million.

Hehe, that's why I think the original question, with Box A being $1,000 and Box B being a million, was kinda boring, since $1,000 is barely anything in today's world. Three more zeroes does make things more interesting.

[–] wols@lemmy.ml 5 points 2 years ago

I think the major unanswered question is how reliable do we think the machine is? 50%? 100%? I think the most interesting scenario is one where we are convinced that the machine actually predicts the future and always predicts correctly, so I'll continue with that assumption in mind.

From one point of view, we have no reason not to take both boxes, since we can't alter the machine's prediction now; it's already happened. I think, however, that this undermines my premise. Choosing both boxes only makes sense if we don't actually believe the machine predicts the future.

One would be tempted to say "alright, then I will choose only box B, as the machine will have predicted that and I will get lots of money. If I were to choose both boxes, the machine would have predicted that too, and I would get much less money."

My argument is that both answers are wrong in a sneaky way: assuming an actual perfect predictor, my answer is box B only. However, the important part here is that this will not be, in fact, a choice. The result was already determined ahead of time, so I really only had that one option.

[–] Flicsmo@rammy.site 3 points 2 years ago

Well if it's a machine that's 100% correct in its predictions obviously I'd take box B since that'd be a guaranteed billion - but assuming it's fallible, I'd go with A+B. A million dollars is plenty of money, I don't even know what I'd do with a billion.

[–] RedMarsRepublic@vlemmy.net 3 points 2 years ago

Box B only, why should I presume I'm smarter than this machine?

[–] kthxbye_reddit@feddit.de 3 points 2 years ago (2 children)

The best case result is 1.001.000.000 (A+B) vs 1.000.000.000 (B) only. Worst case is I have 1.000.000 only.

I go with B only because the difference feels tiny / irrelevant.

Maybe I actually have free will and this is not determinism kicking in, but who knows. I'm not in for the odds with such a tiny benefit.

[–] wols@lemmy.ml 2 points 2 years ago (1 children)

Well if you actually have free will, how can the machine predict your actions?

What if someone opened box B and showed you what was in it? What would that mean? What would you do?

[–] kthxbye_reddit@feddit.de 1 points 2 years ago* (last edited 2 years ago)

I meant, let's imagine the machine predicted B and is wrong (because I take A+B). I would call that scenario "I have free will - no determinism." Then I will have 1.000.000.000 "only". That's a good result.

Maybe interesting: Wiki - Determinism

[–] OptimusFine@kbin.social 2 points 2 years ago (1 children)

Worst case is I have 1.000.000 only.

Except that's not the worst case. If the machine predicted you would pick A&B, then B contains nothing, so if you then only picked B (i.e. the machine's prediction was wrong), then you get zero. THAT'S the worst case. The question doesn't assume the machine's predictions are correct.

[–] kthxbye_reddit@feddit.de 1 points 2 years ago* (last edited 2 years ago)

Good point. Actually I was assuming that the machine's predictions were never wrong. That's also what is defined in the Newcomb's Paradox wiki page.

If that's not a 100% given, you are definitely right.

[–] technopagan@feddit.de 3 points 2 years ago

Box A only because ain't nobody got time for any "Paradox" BS & daddy's got bills to pay. Trick someone else with all that Time Travel nonsense!

[–] FlowVoid@kbin.social 3 points 2 years ago (1 children)

It's much easier if you reframe the problem:

Someone says they've built a machine that can perfectly predict what you will do. Do you believe them?

If so, take one box.
If not, take both boxes.

[–] CoderKat@kbin.social 1 points 2 years ago (1 children)

But even if you don't believe them, the machine still has a 50% chance of being right, like a coin toss.

[–] FlowVoid@kbin.social 2 points 2 years ago (1 children)

Regardless of whether the machine is right, if you don't believe it can perfectly predict what you'll do then taking both boxes is always better than just one.

[–] mr-strange@kbin.social 1 points 2 years ago

Yeah, at least you'll have an extra box to keep your stuff in.

[–] Pagliacci@lemmy.ml 3 points 2 years ago

I'd take both boxes.

We've been given no information on the accuracy of the machine's predictions. Therefore, we have to assume it has just as good of a chance of being wrong as being right. There's essentially a 50/50 chance that box B has $1,000,000,000 regardless of my choice, so I would choose the option that at least guarantees the smaller prize while still giving me the same chance at the larger prize.
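For what it's worth, here is that 50/50 assumption as a short expected-value sketch (assumed Python; the probability is the commenter's guess, not something given in the problem):

p_full = 0.5                                    # assumed 50/50 that box B holds $1B, independent of the choice
ev_b_only = p_full * 1_000_000_000              # $500,000,000
ev_both = 1_000_000 + p_full * 1_000_000_000    # $501,000,000
print(ev_b_only, ev_both)

Under that assumption, taking both boxes is worth exactly the guaranteed $1,000,000 more.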

[–] TauZero@mander.xyz 3 points 2 years ago

Here's my solution to Newcomb's Paradox: the predictor can be perfectly infallible if it records your physical state and then runs a simulation to predict which box you'll pick. E.g. it could run a fancy MRI on you as you are walking through the hallway towards the room, quickly run a faster-than-real-time physical simulation, and deposit the correct opaque box into the room before you open the door. The box, the hallway, the room, the door are all part of the simulation.

Here's the thing: a computer simulation of a person is just as conscious as a physical person, for all intents of "consciousness". So as you are inside the room making your decision, you have no way of knowing if you are the physical you or the simulated you. The predictor is a liar in a way. The predictor is telling the simulated you that you'll get a billion dollars, but stating the rules is just part of the simulation! The simulated you will actually be killed/shut down when you open the box. Only the physical you has a real chance to get a billion dollars. The predictor is counting on you to not call it out on its lie or split hairs and just take the money.

So if you think you might be in a simulation, the question is: are you generous enough towards your identical physical copy from 1 second ago to cooperate and one-box? Or are you going to spitefully deprive them of a billion dollars by two-boxing just because you are about to be killed anyway? Remember, you don't even know which one you are. And if you are the spiteful kind, consider that we are already making much smaller time-cooperative trade-offs all the time, such as the you-now taking a breath just so that the you-five-seconds-from-now doesn't suffocate to death.

What if the predictor doesn't use a MRI or whatever? I posit that whatever prediction method it uses, if the method is sufficiently advanced to be infallible then somewhere in the process it MUST be creating conscious observer instances.

[–] elavat0r@mander.xyz 2 points 2 years ago

I'd much rather take a sure million with a (slight?) chance of a bonus billion, versus an unknown chance at 0 or a billion. I could do plenty with a million that would significantly change my life for the better.

But I would probably do the opposite if A contained $1000 and B contained a potential million as in the original example. $1000 is a tolerable amount to risk missing out on.

[–] Cameli_Hostis@lemmy.world 2 points 2 years ago

Do I have access to a What-If machine and the Finglonger?

[–] ulu_mulu@lemmy.world 2 points 2 years ago* (last edited 2 years ago)

If I wanted to use logic, I'd say taking both A and B is the only way to have a guaranteed $1,000,000 outcome, because B only could get you money but also nothing.

But, if I choose B only, I'm sort of "forcing" the machine into that kind of prediction, right? I don't know about this experiment, but since your post says it's a paradox, I think that's how it works.

So my choice is B only, the machine has predicted it and I get a nice $1,000,000,000.

Am I totally off? :D

[–] XPost3000@lemmy.ml 2 points 2 years ago

Both. A million dollars guaranteed, I could live a comfortable life with that.

[–] Sordid@kbin.social 2 points 2 years ago* (last edited 2 years ago) (1 children)

Both! Critically, the contents of box B depend on the machine's prediction, not on whether it was correct or not (i.e. not on your subsequent choice). So it's effectively a 50/50 coin toss and irrelevant to the decision-making process. Let's break down the possibilities:

Machine predicts I take B only, box B contains $1B:

  • I take B only - I get $1B.
  • I take both - I get $1.001B

Machine predicts I take both, box B is empty:

  • I take B only - I get nothing.
  • I take both - I get $1M.

Regardless of what the machine predicts, taking both boxes produces a better result than taking only B. The question can be restated as "Do you take $1M plus a chance to win $1B or would you prefer $0 plus the same chance to win $1B?", in which case the answer becomes intuitively obvious.
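A tiny sketch of that payoff table (hypothetical Python, just enumerating the four cases with OP's numbers):

A, B = 1_000_000, 1_000_000_000

for prediction in ("B only", "both"):
    box_b = B if prediction == "B only" else 0   # contents are fixed by the prediction alone
    for choice in ("B only", "both"):
        payout = box_b + (A if choice == "both" else 0)
        print(f"predicted {prediction:>6} | choose {choice:>6} -> ${payout:,}")

For either fixed prediction, choosing both boxes pays exactly $1,000,000 more than choosing B only, which is the dominance argument above.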

[–] FlowVoid@kbin.social 1 points 2 years ago (2 children)

But if it's true that the machine can perfectly predict what you will choose, then by definition your choice will be the same as its prediction. In which case, you should choose one box.

[–] annegreen@sh.itjust.works 2 points 2 years ago

Though OP never actually stated that the machine can perfectly predict the future. If that’s the case, then yes, you should just take box B. But we’re not given any information about how it makes its prediction. If @Sordid@sh.itjust.works is correct in assuming it’s a 50-50, then their strategy of taking both is best. It really depends on how the machine makes its prediction.

[–] Sordid@kbin.social 1 points 2 years ago* (last edited 2 years ago)

No information regarding the machine's accuracy is provided, but the fact that you are asked to make a choice implies that it is not perfect. The question explicitly specifies that the prediction has already been made and the contents of box B have already been set. You can't retroactively change the past and make the money appear or disappear by making a decision, so if your choice must match the prediction, then it's not your choice at all. You lack free will, and the decision has already been made for you by the machine. In that case the entire question is meaningless.

[–] gk99@kbin.social 1 points 2 years ago* (last edited 2 years ago) (1 children)

The machine has already done its prediction and the contents of box B have already been set. Which box/boxes do you take?

If my choices don't matter and the boxes are predetermined, what point is there to only taking one box? The machine already made its choice and filled the boxes, so taking both boxes is always the correct answer. Either I get $1,000,000 if the machine thought I would take both, or I get $1,001,000,000 if it didn't. This is a false dilemma, there is never a reason to take just one box.

[–] IsThisLemmyOpen@lemmy.dbzer0.com 1 points 2 years ago (1 children)

This isn't a false dilemma. Imagine if the way the machine predicts is by copying your brain and putting it in a simulated reality, where the copy of you gets asked to choose which boxes to take in the exact same way and is given the exact same information. Under this assumption, the machine could predict with 100% accuracy what the real you would've chosen.

How do you know you are even the real you? You could just be the machine's simulation of the real you.

There is a dilemma and the dilemma is about how much you want to trust the machine.

[–] FlowVoid@kbin.social 1 points 2 years ago* (last edited 2 years ago)

If you are a simulation, then your choice doesn't matter. You will never get any real benefit from the boxes. It's like saying, "there is also a finite possibility that the machine is lying and all the boxes are empty". In which case, the choice is again irrelevant.

Situations in which your choice doesn't matter are not worth considering. Only the remaining possibility, that you are not a simulation and the machine is not lying, is worth considering.

[–] Hudell@lemmy.dbzer0.com 1 points 2 years ago

I would assume the machine would predict I take both because it would know me too well to believe there would be anything in B, so I would take A.

[–] AnonTwo@kbin.social 1 points 2 years ago

I feel like unless we're talking about supernatural AI the only answer is A&B

Otherwise the box has no real way of knowing what you would've picked, so it's complete RNG.

If there was a realistic way that it could make that decision, I'd choose only B, but otherwise it just doesn't make sense.

edit: I also didn't realize until after I read it that box A always has the million dollars. So there's actually no reason to pick only box B in this scenario. The paradox only makes sense if box A is significantly less than box B. It's supposed to be a gambling problem but A&B is completely safe with the changes made.

[–] kakes@sh.itjust.works 1 points 2 years ago

Box A and B hands down. 1,000,000 birds in the hand are worth 1,000,000,000 in the bush.

[–] MrComradeTaco@lemmy.fmhy.ml 1 points 2 years ago* (last edited 2 years ago)

I can make more betting on horse races with Box B, so I will go with Box B.

[–] hypelightfly@kbin.social 1 points 2 years ago (1 children)

Box A and B as the prediction has already been made so the choice has no bearing on the contents at this point. You either get the guaranteed million or both.

Well, what you choose may not directly affect what is inside Box B, but there is still a huge difference between the two choices.

Imagine the way that the machine did its prediction was by copying your brain and making this copied brain choose in a simulation. Assuming the copied brain is completely identical to your brain, the machine could predict with 100% accuracy what the real you would choose. In this sense, what you choose can affect what's inside Box B (or rather, what your copied brain chooses can affect what's inside Box B).

One more thing to think about: How do you know that you aren't the simulated brain that's been copied?

[–] Double_A@kbin.social 0 points 2 years ago (1 children)

To throw some extra spice into this: what happens if the player decides to choose randomly?

It depends. If you mean flipping a coin, then you should know that no coin flip or dice roll is truly random; it is random to us only because we can't predict it with our current technology. This scenario assumes that there are machines in the world that can predict the future; we just don't know whether this particular machine is accurate or not.

Now if you are talking about quantum-based randomness, I mean... I think the machine could just put $0 in the second box just to fuck with you.
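A rough back-of-the-envelope sketch of those two sub-cases (assumed Python; the "machine leaves B empty" tiebreak is the one suggested in the comment above, not part of the original problem):

A, B = 1_000_000, 1_000_000_000

# Player flips a fair coin: take both with probability 0.5, take B only with probability 0.5.
# Case 1: the flip is effectively predictable, so the machine's prediction matches the pick.
ev_predictable = 0.5 * A + 0.5 * B        # both -> B empty -> $1M ; B only -> B full -> $1B
# Case 2: the flip is truly random and the machine defaults to putting $0 in B.
ev_spiteful = 0.5 * (A + 0) + 0.5 * 0     # both -> $1M ; B only -> $0
print(f"predictable coin:       ${ev_predictable:,.0f}")   # $500,500,000
print(f"machine leaves B empty: ${ev_spiteful:,.0f}")      # $500,000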

[–] falconfetus8@lemmy.world -1 points 2 years ago (2 children)

OP never said there could be a prize in Box A. There's either a prize in Box B, or no prize at all. So there's zero point in taking both boxes.

[–] Joe@kbin.social 3 points 2 years ago

Box A has $1,000,000

Literally the title of the post

[–] GunnarRunnar@kbin.social 2 points 2 years ago

Box A has $1,000,000
