Re: Simulation argument in the NY Times

From: Stathis Papaioannou
Date: Wed Aug 22 2007 - 00:58:45 MDT

On 22/08/07, Matt Mahoney <> wrote:
> --- Stathis Papaioannou <> wrote:
> > I'd be certain that an exact biological copy of me has the same
> > consciousness. I'd be almost certain that a neuron by neuron computer
> > emulation of my brain would have the same consciousness as me (David
> > Chalmers's fading qualia argument). However, I couldn't be certain that
> > some machine designed to copy my behaviour well enough to pass for me
> > would have the same consciousness as me; it might be a p-zombie, or
> > more likely it might just have a completely different consciousness. I
> > would agree to be destructively uploaded in the first two cases, but
> > not the last.
> So you argue that consciousness (defined as that which distinguishes you from
> a p-zombie) depends on the implementation of your brain? Does it matter if
> the neural emulation is optimized by simulating average firing rates rather
> than individual pulses? How about simulating the collective behavior of
> similarly weighted neurons with single neurons? How about simulating the
> visual cortex with a scanning window filter? What aspect of the computation
> results in consciousness?
> What if the copy is not exact? You could upgrade your upload with faster
> neurons, more memory, additional senses, finer motor control, a wireless
> internet connection, and so on. At what point does your consciousness not
> transfer?

I would be happy for my upload to deviate from an emulation to the
same extent that I would be happy for my biological brain to deviate
from its present state.

> Suppose the destructive upload consists of a nondestructive exact biological
> copy, followed by you shooting yourself. Would you pull the trigger?

No. I would happily elaborate but have found that these discussions
tend to annoy people who have heard most of it before.

Stathis Papaioannou

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:58 MDT