Re: Self vs. other (was Re: Balance of power)

From: William Pearson (
Date: Thu May 01 2008 - 02:27:51 MDT

2008/4/25 Matt Mahoney <>:
> --- William Pearson <> wrote:
> > 2008/4/23 Matt Mahoney <>:
> > > Anyway, why does it matter what "self" is? Are your mitochondria part of
> > > you?
> > > When you drive, is the car an extension of your body?
> > >
> > If it is possible to make computers part of a human self, through
> > sufficiently advanced AI or whatever*, then it has vast consequences
> > on the potential paths the world will take.
> >
> > If computers and the AIs developed later will always be separate
> > "selves", then conflict and paper clipping are likely and FAI is
> > needed. Not to say that humanity magnified by AI that has no separate
> > self will be all sweetness and light. We are quite capable of causing
> > conflict by ourselves.
> You are assuming that if your brain were augmented with silicon, that the
> carbon part would be in control. Information flows both ways.

True, but not all information is equally able to alter the state of
the systems (brains and computers).

I'm not assuming anything. I'm proposing that humans have the ability
to send control signals above and beyond normal information. Consider
the amygdala's and the dopaminergic system's influence on the rest of
the brain. I'm suggesting that humans will design computer systems
such that the human plays the amygdala's role.
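The asymmetry being proposed here, ordinary information flow versus privileged control signals reserved for one party, can be sketched in code. This is a minimal illustrative toy, not anything from the thread: all names (Module, CONTROLLER_KEY, gain) are invented for the example. The point it demonstrates is that anyone can send data into the module, but only the holder of the controller's key can change *how* that data is weighted, the way a modulatory system like the amygdala retunes downstream processing.

```python
# Hypothetical sketch: a processing module with two kinds of input.
# Ordinary data updates state; a privileged control channel (held by
# the human/"amygdala" side) retunes how data is processed.

CONTROLLER_KEY = object()  # unforgeable token held only by the controller

class Module:
    def __init__(self):
        self.gain = 1.0   # can only be changed via the control channel
        self.state = 0.0

    def data(self, x):
        """Ordinary information flow: anyone may call this."""
        self.state += self.gain * x

    def control(self, key, new_gain):
        """Privileged flow: rejected unless the caller holds the key."""
        if key is not CONTROLLER_KEY:
            raise PermissionError("control signals reserved for the controller")
        self.gain = new_gain

m = Module()
m.data(2.0)                      # data from any source: state becomes 2.0
m.control(CONTROLLER_KEY, 0.1)   # controller damps the module's responsiveness
m.data(2.0)                      # same input now has a tenth of the effect
```

Both calls carry information into the module, but only the control call changes the rules by which later information is interpreted, which is the distinction being drawn between information and control.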

> In reality, an augmented brain would probably have a wireless internet
> connection. It would matter little whether you communicate with it using
> direct neural connections or use language.
> I use Google as an extension of my memory. It partially controls what web
> pages I look at. I just don't think of it that way.

And it is not part of you, by my definition, because you need to posit
another agency to explain why it displays adverts and so on.

  Will Pearson

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT