Re: Is a Person One or Many?

From: Stathis Papaioannou (stathisp@gmail.com)
Date: Sun Mar 09 2008 - 23:10:21 MDT


On 10/03/2008, Mike Dougherty <msd001@gmail.com> wrote:

> > - If 1000 copies of you are made in London lacking 50% of your
> > memories and 10 copies are made in Paris lacking 5% of your memories,
> > are you more likely to find yourself waking up in London or Paris?
> >
> I guess it depends on what part of your memories you use to identify your
> Self :)

Let's say many memories, aspects of personality etc. are involved in
the sense of self, and these are lacking in the proportions given
above. Is your subjective probability of ending up as a particular
copy proportional to that copy's fidelity? What if the copying is low
in quality but high in quantity? You would expect to experience
*something*, since you would expect to experience something after a
head injury with partial memory loss. But what exactly should you
expect if there are multiple copies involved? I don't think there is
a clear answer, because the notion of subjective probability is based
on the idea that you are a single person who persists through time.
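
To make the ambiguity concrete, here is a toy calculation (my own
sketch; the equal-count and fidelity-weighted rules below are just two
of many assumptions one could make, not anything settled):

# Two candidate rules for the subjective probability of waking up
# as one of the London copies rather than one of the Paris copies.
london_copies, london_fidelity = 1000, 0.50   # lacking 50% of memories
paris_copies, paris_fidelity = 10, 0.95       # lacking 5% of memories

# Rule 1: every copy counts equally, regardless of fidelity.
p_london_equal = london_copies / (london_copies + paris_copies)      # ~0.990

# Rule 2: weight each copy by its fidelity to the original.
weight_london = london_copies * london_fidelity                      # 500.0
weight_paris = paris_copies * paris_fidelity                         # 9.5
p_london_weighted = weight_london / (weight_london + weight_paris)   # ~0.981

print(p_london_equal, p_london_weighted)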

> > - You are offered two choices:
> > (a) 100 copies of you are made in London and one copy is made in
> > Paris. The Paris copy is tortured while the London copies are not
> > tortured.
> > (b) 100 copies of you are made in London and one copy is made in
> > Paris. After an hour, the 100 copies in London are tortured while the
> > copy in Paris is duplicated 100,000 times and none of these copies are
> > tortured.
> > Is your subjective probability of being tortured at the moment of the
> > original copying greater in (a) or (b)?
>
> How do we even measure the _degree_ of torture? If I am to identify with
> every copy as myself: scenario A divides my identity 101 ways and only one
> is tortured (resulting in <1% subjective torture, assuming the 100 London
> copies are blissfully happy). Scenario B divides my identity into
> 100,100 parts, with 100 being tortured (<0.1% subjective torture). So it
> doesn't seem to matter that scenario B has 100 times as much torture in my
> future, because it is statistically overwhelmed by the greater number of
> copies experiencing the not-torture option.

That's one way to look at it. The other way is that if you choose (b)
you have a 100-fold greater chance of finding yourself in London,
where torture is certain in an hour, than of finding yourself in
Paris, where you will notice nothing at all when you are duplicated
in an hour. To change the example a little: suppose the copies, on
finding themselves in one city or the other, could themselves decide
whether the Paris duplication goes ahead. That decision would make no
difference to the subjective probability of being tortured for either
the London copies or the Paris copy, so why should it make a
difference when it is made at an earlier point?
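
To put rough numbers on the disagreement (again only a sketch, and it
assumes each copy counts equally):

# Mike's reading: count every copy that exists after the second
# duplication, an hour later.
p_torture_counting_all = 100 / (100 + 100_000)   # ~0.001

# The reading above: count branches at the moment of the original
# copying; 100 of the 101 copies wake up in London, where torture
# is certain an hour later.
p_torture_at_copying = 100 / (100 + 1)           # ~0.990

print(p_torture_counting_all, p_torture_at_copying)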

> > Note that there is no problem *objectively* describing what happens in
> > any of these cases. Lee tries to take the objective point of view and
> > translate it into the subjective point of view as well. But to do this
> > would require a complete overthrow of our notions of anticipation and
> > subjective probability, and I don't think this is possible without
> > rewiring our brains.
> >
>
> I believe the rewiring our brains is part of the assumptions for this
> thought experiment. I assumed the greater "run-time" offered by many copies
> was only really possible in either an uploaded consciousness situation or
> some massively expanded awareness of multiple worlds. In either case,
> rewiring a brain would seem to be the easy part. :)

I am assuming that, though our brains might be rewired, they remain
functionally identical to our present brains, including beliefs about
personal identity and subjective probability. Of course, if the
uploads are designed, for example, to desire greater total runtime
regardless of subjective probability and anticipation of the future,
that would change things.

-- 
Stathis Papaioannou

