Re: ESSAY: How to deter a rogue AI by using your first-mover advantage

From: Vladimir Nesov (robotact@mail.ru)
Date: Tue Aug 28 2007 - 06:12:57 MDT


Tuesday, August 28, 2007, Stathis Papaioannou wrote:

SP> All simulations which provide an identical 1st person POV are
SP> equivalent, and in a significant sense redundant, because it is
SP> impossible even in principle to know which one you are in. If the
SP> number of simulations were suddenly increased or decreased, it would
SP> be impossible for you to know that anything unusual had happened: your
SP> next subjective moment will be in one and only one simulation, any of
SP> the simulations will do as that one simulation, so provided there is
SP> at least one simulation available to choose from you can never know
SP> what's "really" going on.

What, then, if ALL the simulations are available at least once, through
a TM enumerator? Does that mean that ALL other simulators are
irrelevant, since for each simulation there is at least one protected
identical simulation (with identical past and future) running on the
TM enumerator?
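
To spell out what I mean by a TM enumerator, here is a minimal
dovetailing sketch: one machine interleaves programs 0, 1, 2, ... so
that each index eventually gets unboundedly many steps. The names
dovetail and toy_step are only my illustrative placeholders, not
anything from the essay.

# Toy dovetailer: one machine interleaves programs 0, 1, 2, ...
# In phase k, programs 0..k-1 each get one more step, so as the number
# of phases grows, every program index receives unboundedly many steps.
# Program i here is just a toy counter; in the real argument it would
# be a universal TM running simulation i.

def dovetail(step, phases=5):
    """Run programs 0..k-1 for one step each in phase k, for `phases` phases."""
    states = {}                      # program index -> current state
    for k in range(1, phases + 1):
        for i in range(k):
            states[i] = step(i, states.get(i))
    return states

def toy_step(i, state):
    # Placeholder "simulation": program i just counts how many steps it got.
    return (state or 0) + 1

if __name__ == "__main__":
    print(dovetail(toy_step, phases=5))
    # {0: 5, 1: 4, 2: 3, 3: 2, 4: 1} -- with more phases, no program is ever starved.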

SP> However, if there are two or more competing "next moments" then the
SP> number of simulations is relevant. If there are X simulations in which
SP> you are tortured and Y simulations in which you are not tortured in
SP> the next moment, then you have a X/(X+Y) chance of being tortured.

You can't distinguish between the worlds in which you will be tortured,
and you can't distinguish between the worlds in which you won't. You
also can't distinguish between all of these worlds together prior to
the potential torture time point. Even if you assume that the
subjective POV is located in individual simulations, you can't jump to
a different simulation. All you can do is prepare for different future
possibilities to different degrees, to balance resources among possible
futures. Will your actions be different if the ratio of futures with
torture to futures without is 1000:1 rather than 1:1000? In all 1000
simulations of either outcome you will be a completely identical
entity, thinking the same thoughts about the same events at the same
moments. How are 1000 identical copies experiencing torture worse than
1 copy experiencing torture, if it's the same experience?
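
Just to keep the numbers in front of us, this is what the counting view
above gives for those two ratios; it is nothing beyond the X/(X+Y)
formula, and the function name is mine.

# The counting measure in question: P(torture) = X / (X + Y).
def p_torture(x_torture, y_no_torture):
    return x_torture / (x_torture + y_no_torture)

print(p_torture(1000, 1))   # ~0.999: "almost certainly tortured"
print(p_torture(1, 1000))   # ~0.001: "almost certainly not"
# My question stands: since the 1000 copies on either side are
# bit-identical, what observable difference is there between the cases?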

To the point of my previous message: how do you count simulations?
What is your solution to the counting paradox I described there?
Does the presence of 2^N simulations within a single implementation
somehow influence the probability of you being in a certain simulation?
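
To make that last question concrete (this is only my own illustrative
framing, not the construction from the previous message): under a
counting measure, everything hinges on whether one implementation that
arguably holds 2^N identical simulations gets weight 2^N or weight 1.

# Two candidate weights for one implementation containing 2**N
# identical simulations of you, against `other_simulations` rivals.
# Nothing here decides which weighting is correct; that is the question.
def p_in_target(n_bits, other_simulations):
    by_simulation     = 2**n_bits / (2**n_bits + other_simulations)
    by_implementation = 1 / (1 + other_simulations)
    return by_simulation, by_implementation

print(p_in_target(n_bits=20, other_simulations=1000))
# (~0.99905, ~0.000999): the two weightings give wildly different answers.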

-- 
 Vladimir Nesov                            mailto:robotact@mail.ru

