cyborgs & ghosts (was Re: AGI motivations (Sidetrack on Uploading))

From: Pope Salmon the Lesser Mungojelly (rainbow@beautywood.org)
Date: Mon Oct 24 2005 - 12:49:15 MDT


On Mon, 24 Oct 2005 12:30:45 -0400, Richard Loosemore <rpwl@lightlink.com>
wrote:

> Remember what uploading would involve. You probably have to know the
> size, shape and possibly the chemical state of every neural cell and
> every one of its connections to all others, and maybe even the state of
> all the synaptic buttons, in three dimensions, in circumstances where
> the darn things are not broadcasting much EM but hiding most of their
> signals in chemical waves.

The whole concept seems a little backwards to me. It seems like it's
predicated on the idea that once we get to such a level of technology,
we're still going to be mostly biological intelligences and we're going to
continue to identify ourselves primarily, if not entirely, with our meat.

Consider first of all that long before we have the technology for
uploading, we are going to have the technology to replace particular
regions of the brain with equivalent (&/or better &/or radically
different) electronic components. At first, of course (& this has already
started on a primitive level), this will be sold as a "treatment" for
"diseases." When you get old & your hearing starts to go, they'll replace
your auditory system; at first it'll be a crude replacement, then it'll be
comparable to a healthy brain's system, & then it will become dramatically
superior.

Suppose, theoretically, that you were going to upload someone who already
had their native auditory system replaced with an electronic version.
Uploading that part would then consist of a file copy. (I'm not
suggesting that that lowers the difficulty of uploading the rest of the
brain; I'm just trying to get us into a new frame.)

Long before we have the ability to upload, we'll have the ability to
replace even higher/inner parts of the brain. Electronic brain modules
will be sold as "treatments" for people with strokes, for instance. The
result will be people who are very deeply jacked in. It's inevitable that
the new parts we install will have externally modifiable software; long
before anyone is fully uploaded we'll have a planet full of people
"messing" with themselves.

Take another angle. A large part of what makes a person unique is their
memories. Very soon (already, depending on how you look at it) our
electronic memories of our lives will be vastly more comprehensive and
accurate than our meat memories. It will become increasingly arbitrary to
identify ourselves more with the memories that our meat presents to us
than with those that our computers do. If we're honest, in fact, it will
be impossible to deny that meat memory is vague, shifty, &
usually wrong; electronic memory will show it up continuously &
dramatically once it gets to the level of "Computer, show me what I was
doing at 2:16 PM on October 24th, 2005."
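
(Just to make that concrete: given any sort of personal log to query, a
question like that is already answerable with a trivial query. Here's a
toy sketch in Python; the "lifelog.db" database, the activity_log table &
its schema are all made up purely for illustration.)

    import sqlite3
    from datetime import datetime, timedelta

    # Hypothetical personal lifelog: one row per recorded event, with
    # timestamps stored as ISO 8601 text so they sort correctly.
    conn = sqlite3.connect("lifelog.db")
    conn.execute("CREATE TABLE IF NOT EXISTS activity_log (ts TEXT, activity TEXT)")

    def what_was_i_doing(when, window=timedelta(minutes=5)):
        """Return every logged activity within a few minutes of 'when'."""
        rows = conn.execute(
            "SELECT ts, activity FROM activity_log "
            "WHERE ts BETWEEN ? AND ? ORDER BY ts",
            ((when - window).isoformat(), (when + window).isoformat()))
        return rows.fetchall()

    # "Computer, show me what I was doing at 2:16 PM on October 24th, 2005."
    for ts, activity in what_was_i_doing(datetime(2005, 10, 24, 14, 16)):
        print(ts, activity)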

The electronic parts of us (such as our personal computer hard drives) are
presently rather static, which lends strength to the intuition that they
are "dead." This has been slowly ending for a long time. The predictions
that we would have electronic "agents" which would go out and search for
news articles that interest us have in fact come to pass. They're
tremendously dumb, of course, but that's not going to last. More and more
intelligence will be embedded in the systems around us, and we will come
to feel deeply, emotionally intertwined with that intelligence.

We are going to have some interesting ghosts. A ghost is what I call the
machine part of a human-computer intelligence after the human part dies.
If I were to die today, my ghost wouldn't do anything very interesting; it
would continue to collect and sort my email for me, find news articles I
might be interested in, change my desktop background to random pictures,
download my podcasts, etc. Over the next few years, however, ghosts will
become increasingly autonomous. They'll still be useless; they won't hold
up conversations; they'll have their whole purpose structured around
pleasing a central meat processor who no longer exists to care; but
they'll do more & more stuff on their own, setting their own priorities &
keeping themselves busy without outside encouragement.
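
(There's nothing mystical about a ghost, at least not yet; today it would
be little more than a loop of scheduled chores. A toy sketch, again in
Python, with every task name invented purely for illustration:)

    import time

    # Toy sketch of a "ghost": routine tasks that keep running whether or
    # not their owner is still around to care.

    def sort_new_email():
        print("filing new mail into folders...")

    def fetch_interesting_news():
        print("searching feeds for articles the owner used to like...")

    def rotate_desktop_background():
        print("picking a random picture for the desktop...")

    def download_podcasts():
        print("grabbing any new podcast episodes...")

    # (task, how many seconds to wait between runs)
    TASKS = [
        (sort_new_email, 5 * 60),
        (rotate_desktop_background, 30 * 60),
        (fetch_interesting_news, 60 * 60),
        (download_podcasts, 6 * 60 * 60),
    ]

    def run_ghost():
        last_run = {task: 0.0 for task, _ in TASKS}
        while True:                      # nobody left to tell it to stop
            now = time.time()
            for task, interval in TASKS:
                if now - last_run[task] >= interval:
                    task()
                    last_run[task] = now
            time.sleep(30)

    if __name__ == "__main__":
        run_ghost()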

We'll have to seriously consider: Is it ethical to shut down a ghost? The
user might have started a program with instructions that will keep it busy
for a year. Intentional acts won't lose all of their intelligence just
because they've lost their user. I believe that the locus of selfhood
will start to drift a bit.

Most people identify themselves pretty strongly with their brain as a
whole, but not so much with each individual module. Who's to say that we
won't "upload" by simply replacing parts that seem too small to be "me"
until there's no meat left?

I think uploading sounds like a fun idea, but it will come on the back
side of the curve. Long before then, we will be personally joining with
vast intelligences, so who knows whether it will even make the same sort
of sense to us by the time it becomes possible.

<3,
mungojelly


