From: Matt Mahoney (matmahoney@yahoo.com)
Date: Sat Mar 08 2008 - 19:26:04 MST
On 3/7/08, Eliezer S. Yudkowsky <sentience@pobox.com> wrote:
> If you flip a fair quantum coin, have your exoself generate 100
> separated isomorphic copies of you conditional on the coin coming up
> heads, then, when (all of) you are about to look at the coin, should
> your subjective anticipation of seeing "heads" be 1:1 or 100:1?
So I do a simulation to find the answer. (Actually I work it out on paper
because I am too lazy to write a program). How to model subjective
anticipation? Typically we use probability theory as an approximation of
uncertainty. We do this because true uncertainty (with only a Solomonoff
prior) depends on your choice of universal Turing machine, and isn't
computable anyway.
So we have to construct a model where probability theory is appropriate. If
you have never seen a coin before, you would not have any subjective
anticipation of it doing anything. But if you flipped it hundreds of times
and it came up heads about half the time, then you could assign a
probability of 1/2. Just to make sure, you repeat the experiment hundreds of
times, after which you conclude that there is a probability near 1 that the
probability of heads is near 1/2.
So perhaps a good model of subjective anticipation in an agent is this: if it
does an experiment N times and observes an outcome R times, then it guesses
the probability of that outcome to be R/N, as long as N is large. It is not
perfect, but it is good enough for my simulation.
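To make that concrete, here is a minimal sketch in Python of the frequency-counting model just described; the function name and the choice of 1000 flips are mine:

    import random

    def frequency_estimate(n_flips=1000, p_heads=0.5):
        # Flip a coin n_flips times and guess P(heads) as R/N,
        # where R is the number of heads observed.
        heads = sum(random.random() < p_heads for _ in range(n_flips))
        return heads / n_flips

    print(frequency_estimate())  # prints something close to 0.5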
So now I run the simulation in which, every time an agent observes heads, we
make 100 copies of it. After 1000 trials I end up with a huge number of
agents, most of whom have seen something like 990 heads and 10 tails. (An
agent that has seen k heads exists as 100^k copies, so the copies pile up in
the branches with the most heads; to a randomly chosen agent, each flip
effectively comes up heads with probability 100/101, or about 0.99.) All of
these agents will then guess that the next coin flip will be heads with
probability around 0.99. I can repeat this experiment many times with the
same result.
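Roughly what I have in mind, sketched in Python; the constants and the trick of sampling one agent at a time, rather than enumerating every copy, are mine:

    import random

    COPIES_ON_HEADS = 100   # copies of the agent in each heads branch
    TRIALS = 1000           # coin flips seen by each agent
    SAMPLES = 10000         # number of randomly chosen agents to inspect

    # Enumerating all the copies is infeasible (their number grows by a
    # factor of about 100 on every heads), so instead sample a randomly
    # chosen agent directly, using the 100/101 weighting described above.
    p_heads_for_random_agent = COPIES_ON_HEADS / (COPIES_ON_HEADS + 1)

    estimates = []
    for _ in range(SAMPLES):
        heads = sum(random.random() < p_heads_for_random_agent
                    for _ in range(TRIALS))
        estimates.append(heads / TRIALS)   # the agent's R/N guess

    print(sum(estimates) / len(estimates))   # about 0.99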
My conclusion is that, with probability near 1, an agent chosen at random
will anticipate heads with probability near 0.99.
Maybe it *should* anticipate 0.5, but that's not what it will do if it is
rational.
-- Matt Mahoney, matmahoney@yahoo.com