From: Lee Corbin (lcorbin@rawbw.com)
Date: Sat Mar 08 2008 - 00:49:58 MST
Nick writes
> Eliezer wrote
>
>> If you flip a fair quantum coin, have your exoself generate 100
>> separated isomorphic copies of you conditional on the coin coming up
>> heads, then, when (all of) you are about to look at the coin, should
>> your subjective anticipation of seeing "heads" be 1:1 or 100:1?
>>
>> This is a question that confuses even me, btw.
>
> If you're trying to maximize the total Bayesian score, and the scoring
> rule counts each copy as an individual, you should guess heads at
> 100:1. Of course, this isn't all there is to it; it feels like "what
> will *actually be experienced* with higher probability?" (not "what
> will I experience", mind; "I" is too vague) is a separate, meaningful
> question. I wonder if it really is.
I agree with the 100:1 answer, by the way. Each separately running
process counts the same. Now I also think it a big mistake to say
before the coin flip, "Oh well, hopefully I'll be the one who doesn't
get tortured (say tails gets tortured), and my odds are pretty good."
It is a big mistake because *both* things will happen to you, even
though we are unused to being in two places at the same time.
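
For concreteness, here is a little sketch of where the 100:1 figure
comes from, assuming the "Bayesian score" means the usual log score
and that each of the 100 copies is scored separately (my own
illustration, not anything Nick actually wrote out):

  import math

  def expected_log_score(p):
      # With prob 1/2 the coin lands heads and 100 copies each score
      # log(p); with prob 1/2 it lands tails and the lone copy scores
      # log(1 - p).
      return 0.5 * 100 * math.log(p) + 0.5 * math.log(1 - p)

  # Crude grid search; the analytic optimum is p = 100/101.
  best = max((i / 1000 for i in range(1, 1000)), key=expected_log_score)
  print(best, 100 / 101)   # 0.99 vs. about 0.9901

So the score-maximizing guess is p = 100/101 for heads, i.e. odds of
100:1, just as Nick says.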
> (There's also an ethical question, which may or may not have the same
> answer: is it 100 times as bad for 100 identical people to have
> identical painful experiences as for one person to have one painful
> experience?)
Yes, it is 100 times as tragic as it would be for one person.
Lee