Definition of survival (Re: AI investment (was AGI investment))

From: Matt Mahoney (matmahoney@yahoo.com)
Date: Thu Apr 17 2008 - 13:22:28 MDT


--- John K Clark <johnkclark@fastmail.fm> wrote:

> Matt Mahoney" matmahoney@yahoo.com
>
> > "survive" isn't a well defined term in a
> > post-singularity world.
>
> I don’t care, definitions are of trivial importance, and it’s foolish to
> ask for a definition of “definition”, if you already know you don’t need
> to ask and if you don’t then the question makes no sense. To hell with
> definitions, examples rule!

For example, consider your willingness to step into a teleportation booth and
die, provided that an exact copy of you is made at the same instant, but not
if the copy is made 10 seconds earlier.

Humans evolved a fear of a large number of things that can kill us: hunger,
drowning, falling, sharp objects, extreme heat and cold, etc. We summarize
these as a "fear of death", but that is not really what it is. Animals and
children are not aware that they will die. After all, they never have before,
so why should they expect to?

On close analysis, our logic leads us to absurdities such as the teleportation
example. If consciousness resides in the brain, and the brain is a computer,
then what we want to preserve must be the brain's memories and function. But
if your memories were altered, you would not know the difference. If your
goals were altered, you would not want them changed back to what they were.
So what is it, really, that you want to preserve?

-- Matt Mahoney, matmahoney@yahoo.com


