Re: The role of consciousness

From: Lee Corbin (lcorbin@rawbw.com)
Date: Mon Apr 07 2008 - 23:45:40 MDT


Jeff writes

> Lee wrote:
>
>> By the way, Jeff mentioned that it would be cruel and
>> unethical to not pay the $2, but we get to the heart
>> of the problem by supposing---of course, entirely
>> hypothetically---that Matt (or whoever is speaking,
>> i.e., the subject of the query) is what I have called
>> an MSI (a most selfish individual). We wish to ask
>> what an MSI would do, and what are the logical
>> reasons one way or the other for his choices, given
>> that the MSI values existing.
>
> I think the question of what a "most selfish" individual would do is
> not well-defined, since the self is not well-defined.

But we *do* have an idea of what selfish behavior is,
especially in this particular instance. By expressing
(and, we assume, really *acting* on) concern for
whether or not someone you don't know in Africa
gets tortured, you are clearly not selfish in that
regard. And if you refused to call "selfish" someone
who was completely indifferent to the African's
suffering, then your use of terms would be at variance
with that of other speakers of English.

> It all depends on which future copies of you
> you regard as your "future self", which
> is purely a language convention.

But surely you don't really believe that it's only a
matter of "language convention". Suppose the
Gestapo arrived at your doorstep and proceeded
to set up the apparatus to torture you. If the
Colonel in charge had a philosophic turn of mind,
he might say, "You really should not appear so
apprehensive, Mr. Jones---it's purely a language
convention as to whether it is you or someone else
who is shortly going to undergo suffering." Do you
really expect us to believe that it's all the same
to you whether they do it to you or to some
random African? You really don't believe that
there is a fact of the matter?

To add to this, suppose that you love X, and right
before she undergoes an extremely painful operation,
you express your sympathy to her. Would you be lying
if you then added, "of course, whether it's you who
undergoes the pain or some random African is actually
a matter of indifference to me"? Or would that be
the truth (however hurtful the statement itself
might be)?

> To add to Matt's original thought experiment, here is
> a chain of related thought experiments I find interesting:
>
> 1. Is it ethical to use a "date rape" drug on someone who isn't going
> to remember much in the morning, but will be unable to resist your
> advances during the night?

I say that it is not, because whether or not an individual
remembers something is immaterial to the physical
reality of whether or not it happened to him or her.

> 2. Is it ethical to rape someone and then erase their memories
> completely (or fix them afterwards so that they believe it never
> happened, or they have a blackout)?

Likewise, it is not ethical (by which, as I have said,
I mean little more than "I disapprove", or "we
customarily disapprove", or "we should disapprove").

> 3. Is it ethical to torture someone and then erase their memories of it?

Likewise. While it's bad to *add* memories of having been
tortured, it is much worse to actually torture someone (leaving
their memories as usual).

> 4. Would you be upset if someone started torturing you and then said
> "don't worry, I'll erase your memory later."

:-) There isn't anyone who would fail to be upset, if
we are all using the same meaning of the word "torture",
because the phrasing indicates that the torturer may
be committing his hideous acts based upon a simple
misunderstanding or misapprehension.

> 5. Would you be upset if someone started torturing you and then said
> "don't worry, I've made a backup of your brain and I will use it to
> make an exact copy of you on my home planet, as soon as I figure out
> how to do so... which should only take a few years. I'm not going to
> torture your copy when it wakes up, so it's fine if I torture you now.
> In fact, I'm going to kill you soon so it won't matter at all. Your
> copy won't remember any of this."?

Naturally, that would be terrible. The proper way to
look at it is to integrate (or, to put it less
fancily, *sum*) over all instances of your runtime,
and to evaluate your preferences accordingly.
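
To make the bookkeeping concrete, here is a minimal
sketch in Python. The instances and the utility
numbers are entirely hypothetical, and the idea that
each instance's runtime can be scored with a single
signed number is itself an assumption, not anything
Matt or Jeff has committed to:

    # Assign each instance of "you" a signed utility for
    # its runtime: negative for suffering, positive for
    # ordinary good experience. (Made-up numbers.)
    instances = {
        "original: tortured, then killed": -100.0,
        "copy: woken later, unharmed": +150.0,
    }

    # Sum over all instances of one's runtime, as
    # suggested above, to evaluate the whole scenario.
    net = sum(instances.values())
    print(net)  # 50.0 here; the sign flips whenever the
                # torture outweighs the good runtime

With these made-up numbers the total comes out
positive; reverse the magnitudes and it comes out
negative. That is all I mean below by asking which
of the two is greater quantitatively.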

> 6. Would you be upset if someone started torturing you and then said
> "don't worry I've *already* made a copy of you, so you are redundant".
>
> 7. Would you be upset if someone started torturing you and then said
> "don't worry, I've already made a copy of you *and* I'm going to erase
> your memories soon, so you won't be you at all but the copy will be."
>
> 8. Would you be fine with such torture happening to you, as long as a
> copy was made somewhere else, and as long as you consented to it
> beforehand?
>
> I think number 8 is essentially the question that Matt is asking...

I think I agree entirely with you and with the point
you're making here.

> but I don't really see how 8 is different from 7, 7 is different from
> 6, etc. I would answer "no" to all of these for the same reason.
> Being tortured would upset me... I wouldn't want anyone to do it to
> me, and I wouldn't want to do it to anyone else (including my
> "original" future self) regardless of how many copies of me were made
> and when.

I agree completely. But our point of view implies, at
least to me, that we regard all our copies equally, on
the same ontological footing, as it were. "Bad runtime"
+ "good runtime" then comes out bad or good on net
depending on which of the two is quantitatively greater.

Lee


