Re: [sl4] Re: Uploads coming first would be good, right?

From: Matt Mahoney (matmahoney@yahoo.com)
Date: Fri Feb 20 2009 - 15:05:01 MST


--- On Fri, 2/20/09, Johnicholas Hines <johnicholas.hines@gmail.com> wrote:

> Four points:
>
> 1. Magic consciousness transfer.
>
> Let me say explicitly again: There is no magical consciousness
> transfer. I feel like you're putting words in my mouth, or maybe
> you're arguing against someone else.

You're right, but arguments for uploading always seem to imply a belief in a consciousness that somehow depends on its implementation. For example, most people would argue that if you replaced your neurons one by one with equivalent circuits until your whole brain was in silicon, you would never notice the difference, and that this is somehow different from running a program that simulates you and then killing the original. But if two programs P and Q produce the same output for every input, then they are effectively equivalent. You can't argue that one is conscious and the other is not.
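
Here is a toy sketch in Python (the two functions are arbitrary examples of my own, just to make the point concrete):

    def P(n):
        # add up 0 + 1 + ... + (n-1), one step at a time
        total = 0
        for i in range(n):
            total += i
        return total

    def Q(n):
        # closed form for the same sum
        return n * (n - 1) // 2

    # For every nonnegative input x, P(x) == Q(x), even though the
    # source code and the running time differ completely. By the
    # input-output criterion, P and Q are the same program.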

Where we get into trouble is in starting with the existence of consciousness as an axiom. This invariably leads to absurdities and paradoxes. Starting instead with the axiom that we are programmed to believe in our own consciousness avoids the problem.

> 2. The friends and family test.

Yes, there is a difference between a machine that merely appears to simulate you and one for which it can be proven from the source code that it will always simulate you. The latter might be possible, but Rice's theorem and our experience with software testing make it seem unlikely to be achieved. An even harder problem is for the program P itself (as opposed to some arbitrary program) to prove that Q(x) = P(x) for all x, given the source code of Q. In other words, given the source code of a program that allegedly models your brain, could you yourself (rather than an expert) prove that it actually does?
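
Rice's theorem rules out any general procedure for deciding equivalence. Here is the textbook reduction from the halting problem, sketched in Python (eq is a hypothetical decider; the point of the sketch is that no such function can exist):

    def would_halt(machine, inp, eq):
        # eq is assumed to decide total equivalence:
        # eq(A, B) returns True iff A(x) == B(x) for every x.
        def R(x):
            machine(inp)  # returns iff machine halts on inp
            return 0
        def zero(x):
            return 0
        # R agrees with zero on every input exactly when
        # machine(inp) halts, so a real eq would decide the
        # halting problem. No such eq can exist.
        return eq(R, zero)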

But let's assume for now that it is possible. Should this affect our attitude toward uploading? Or is lots of testing as good as a proof?
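
By "lots of testing" I mean something like this minimal sketch (assuming, for illustration, that the programs take integer inputs):

    import random

    def agree_on_samples(P, Q, trials=1000000):
        # Passing a million random tests is statistical evidence,
        # not a proof, since the input space is unbounded.
        for _ in range(trials):
            x = random.randint(0, 10**9)
            if P(x) != Q(x):
                return False  # found a counterexample
        return True  # no counterexample found; still not a proof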

> 3. The morality of choosing a compromise.

I believe that people should have the right to make copies of their brains (or not) and then kill themselves, if that is what they want. I am more interested in whether we will want to do so, and in what the consequences would be.

> 4. Fatalism.
>
> You paint an unpleasant possible future, where healthy humans are
> non-destructively scanned, a wand is waved, and then they are
> murdered. Then the (non-faithful) uploads entice the scannee's friends
> and family into uploading too. However, you don't seem to advocate any
> action toward averting this future. Why not?

Because I don't see any difference between this and seemingly benign approaches like brain scanning shortly after death. The resulting program is identical. The two only seem different if you believe in consciousness. But people are free to make their own choices.

-- Matt Mahoney, matmahoney@yahoo.com



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:04 MDT