Re: Separate Copies Contribute Separately to One's Runtime

From: Nick Tarleton (nickptar@gmail.com)
Date: Fri Mar 07 2008 - 13:30:39 MST


On Fri, Mar 7, 2008 at 2:52 PM, Eliezer S. Yudkowsky
<sentience@pobox.com> wrote:
>
> Nick Tarleton wrote:
> > On Fri, Mar 7, 2008 at 9:23 AM, Lee Corbin <lcorbin@rawbw.com> wrote:
> >> No, as causally separate processes, each is separately conscious,
> >> even if isomorphic (according to me).
> >
> > What's the difference in anticipated experience between this and them
> > not being separate?
> > (http://www.overcomingbias.com/2007/07/making-beliefs-.html)
>
> If you flip a fair quantum coin, have your exoself generate 100
> separated isomorphic copies of you conditional on the coin coming up
> heads, then, when (all of) you are about to look at the coin, should
> your subjective anticipation of seeing "heads" be 1:1 or 100:1?
>
> This is a question that confuses even me, btw.

If you're trying to maximize the total Bayesian score, and the scoring
rule counts each copy as an individual, you should anticipate heads at
100:1 odds: on heads there are 100 copies each being scored on the
prediction, while on tails there is only one, so the expected total
score is maximized at p = 100/101. Of course, this isn't all there is
to it; it feels like "what will *actually be experienced* with higher
probability?" (not "what will I experience", mind; "I" is too vague) is
a separate, meaningful question. I wonder if it really is.
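
To make that concrete, here is a minimal numeric sketch in Python. It
assumes the scoring rule is a per-observer log score, which the thread
doesn't actually pin down; it just checks that the expected total score
peaks near p = 100/101:

    import math

    def expected_total_log_score(p, copies_on_heads=100):
        # Fair quantum coin: heads (probability 1/2) leaves
        # copies_on_heads observers, tails (probability 1/2) leaves one.
        # Each observer is scored log(probability assigned to the
        # outcome that observer actually sees).
        return 0.5 * copies_on_heads * math.log(p) + 0.5 * math.log(1 - p)

    # Scan candidate anticipations of "heads"; with every copy counted
    # individually, the maximum sits at p = 100/101, about 0.9901.
    candidates = [k / 10000 for k in range(1, 10000)]
    best = max(candidates, key=expected_total_log_score)
    print(best)  # 0.9901 on this grid (100/101 is about 0.990099)

If you count branches rather than copies (drop the factor of 100), the
same calculation puts the maximum back at p = 1/2, i.e. the 1:1 answer.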

(There's also an ethical question, which may or may not have the same
answer: is it 100 times as bad for 100 identical people to have
identical painful experiences as for one person to have one painful
experience?)


