TauZero@mander.xyz 3 points 2 years ago

Here's my solution to Newcomb's Paradox: the predictor can be infallible if it records your physical state and then runs a simulation to predict which box you'll pick. For example, it could run a fancy MRI scan on you as you walk down the hallway towards the room, quickly run a faster-than-real-time physical simulation, and deposit the correct opaque box in the room before you open the door. The box, the hallway, the room, and the door are all part of the simulation as well.
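
To make the mechanism concrete, here's a minimal sketch of the predictor-as-simulator idea. It's only an illustration of the setup described above, not anything from the original problem statement: the function names are mine, the billion-dollar prize follows the figure used in this comment, and the $1,000 in the transparent box is just the usual Newcomb value, assumed here.

```python
# Toy model of the predictor-as-simulator. Names and the $1,000 transparent-box
# value are assumptions; the billion-dollar prize follows the comment above.

OPAQUE_PRIZE = 1_000_000_000   # placed in the opaque box only if one-boxing is predicted
TRANSPARENT_PRIZE = 1_000      # always visible in the clear box (assumed standard value)

def predictor(decision_procedure):
    """Run the agent's own decision procedure (the 'simulated you') to fill the box."""
    predicted = decision_procedure()
    return OPAQUE_PRIZE if predicted == "one-box" else 0

def play(decision_procedure):
    opaque_contents = predictor(decision_procedure)   # box is filled before you enter the room
    choice = decision_procedure()                     # the 'physical you' decides
    if choice == "one-box":
        return opaque_contents
    return opaque_contents + TRANSPARENT_PRIZE

print(play(lambda: "one-box"))   # 1000000000
print(play(lambda: "two-box"))   # 1000
```

Because the very same decision procedure runs in both the "simulation" and the "room", the prediction can't come out wrong; that's the whole point of the setup.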

Here's the thing: a computer simulation of a person is just as conscious as a physical person, for all intents and purposes of "consciousness". So as you stand in the room making your decision, you have no way of knowing whether you are the physical you or the simulated you. The predictor is, in a way, a liar: it tells the simulated you that you'll get a billion dollars, but stating the rules is just part of the simulation! The simulated you will actually be killed/shut down when you open the box. Only the physical you has a real chance at the billion dollars. The predictor is counting on you not to call it out on its lie or split hairs, and to just take the money.

So if you think you might be in a simulation, the question is: are you generous enough towards your identical physical copy from one second ago to cooperate and one-box? Or will you spitefully deprive them of a billion dollars by two-boxing just because you're about to be killed anyway? Remember, you don't even know which one you are. And if you are the spiteful kind, consider that we make much smaller cooperative trade-offs with our future selves all the time, such as the you-now taking a breath just so that the you-five-seconds-from-now doesn't suffocate.
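
And for the "you don't even know which one you are" point, here's a tiny expected-value check (same assumed dollar amounts as the sketch above; `p_sim` is just an illustrative credence, not something from the problem): because both copies run the same decision procedure, the payout to the physical copy doesn't depend on how likely you think it is that you're the simulation.

```python
# Expected payout to the physical copy (the only one who keeps money), given a
# credence p_sim that you are the simulated copy. Dollar amounts are the same
# assumptions as in the sketch above.

def expected_physical_payout(choice, p_sim):
    opaque = 1_000_000_000 if choice == "one-box" else 0   # fixed by the identical simulated choice
    payout = opaque if choice == "one-box" else opaque + 1_000
    # Whether you turn out to be the simulation or the original, the physical
    # copy's payout is the same, so the credence drops out entirely.
    return p_sim * payout + (1 - p_sim) * payout

for p in (0.0, 0.5, 0.99):
    print(p, expected_physical_payout("one-box", p), expected_physical_payout("two-box", p))
# one-boxing gives the physical copy 1000000000 regardless of p; two-boxing gives 1000
```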

What if the predictor doesn't use an MRI or whatever? I posit that whatever prediction method it uses, if that method is sufficiently advanced to be infallible, then somewhere in the process it MUST be creating conscious observer instances.