Re: [sl4] Uploading (was : goals of AI)

From: Johnicholas Hines (johnicholas.hines@gmail.com)
Date: Tue Dec 01 2009 - 12:27:02 MST


> Suppose there was a program that simulated you so well that nobody could
> tell the difference between you and the program in a Turing test
> environment. What is the probability that the program will be you after you
> shoot yourself?

In order to understand the predicate "will be you", we need to
understand what its consequences are - otherwise we're making a
distinction without a difference.

http://en.wikipedia.org/wiki/Distinction_without_a_difference

There's a status quo bias, which might lead us to favor an environment
similar to the present one (no uploading). In order to combat the
status quo bias, we can use Nick Bostrom's reversal test.

http://www.nickbostrom.com/ethics/statusquo.pdf

Suppose that (unbeknownst to all of us), we are routinely duplicated
and one copy is destroyed - some unknown physical process that has
been operating for a long time.

Do you have any desire to change the situation, to eliminate this
duplication-and-destruction? How much desire - what would you give up
in order to obtain "continuity" (whatever that means)? If continuity
were actually much preferable to "duplication-and-destruction", then
you would have some intuition that you would be willing to give up
something in order to obtain it.

I personally experience almost no desire to eliminate this physical
phenomenon, which leads me to believe that my gut reaction against
duplication-and-destruction is actually irrational, and explainable in
terms of the status quo bias, and maybe an aversion to violence or
suicide.

Johnicholas



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:05 MDT