Re: AI and survival instinct.

From: Gordon Worley (
Date: Mon Apr 01 2002 - 21:25:46 MST

On Monday, April 1, 2002, at 10:17 PM, Carlo Wood wrote:

> On Mon, Apr 01, 2002 at 09:28:33PM -0500, Gordon Worley wrote:
>> I'll note that even if what we call the mind in humans decides it
>> doesn't care whether it lives or dies, the brain is hardcoded to want
>> to stay alive.
> The reasons that it is so hard for a human to take
> their own life are:
> 1) religion : fear that the action will cause an eternity
> 2) the irreversibility : after killing oneself, one is very
> So unless you mean with 'hard coded' the fact that a humans brain
> is goal driven and the goal is set to becoming happy, I would
> still consider it the act of pure reason not to commit suicide.

We agree. It's just not clear that we do because of some of the
assumptions I make behind my terms.

By brain I am talking about the physical organ itself. It is a complex
system similar to that in any other animal. It allows for a wide variety
of responses, and even the ability to create new responses, but it does
not do logical reasoning or other things that did not evolve in humans.
It's an advanced tool for making sure our genes propagate.

By mind I refer to the adaptive part of your brain that is capable of
logical reasoning and other learned techniques that were discovered,
driven by the brain's interpreter (i.e. the part of your brain that is

So, when I say that something like suicide is decided by the mind, I
mean it has to be an act of reason where the mind is able to override
what the brain wants. Of course, if your brain is not normative,
getting your brain to do what your mind wants may be easier or harder
(most likely it will be easier).

Correct me if I'm wrong, guys, but based on this terminology, here is
the difference between what Eliezer and Ben think. Eliezer claims that an
AI needs a brain that has the goal of Friendliness on top, hard coded
into place (with the usual exception that the spirit of Friendliness is
what is hard coded, not the letter of what Friendliness means). Ben,
though, thinks that the brain just takes care of very basic stuff and
the mind picks the goals. Or, more accurately, there is an extra goal
layer between brain and mind and this goal layer decides what the mind
can and cannot tell the brain to do, rather than having the brain do its
own mind take-over protection.

> PS2 I have been suicidal several times in the past, so I have
> first hand experienced how hard it is to take ones life, and
> why I didn't do it. And of course, why I shouldn't do it ;).

FWIW, I have OCD and on a regular basis part of my brain decides that
suicide is a good way to get clean but a deeper part of my brain says no
and stops it. Of course, this isn't the kind of suicide that is
generally discussed. While my action would be suicide, it would be
caused because my brain was wired wrong not because I didn't feel like
living. In fact, because I feel like living both deep in my brain and
in my mind, I'm able to suppress that urge. So, anyway, if you ever
hear that I committed suicide (and I have no plans on doing this), it
was because one too many bits flipped in my brain that day, rather than
that I decided that living was not for me. I figure, though, that if I
have managed not to do this in the past 19 years, I should be able to
continue not doing so as long as I stay away from things that change how
my brain works.

Also, I'm noticing that a growing number of people on this list have
abnormal brains. I think it may just seem like a disproportionate number
of us are here, though, because it is relevant to cite personal examples
where we have learned how a normative human brain works from our
abnormal brains. Most people have no idea I have OCD until they ask why
I was washing my hands for five minutes or notice that I am very careful
not to step on the cracks in the sidewalk. ;-)

Gordon Worley
PGP:  0xBBD3B003

`When I use a word,' Humpty Dumpty said, `it means just what I choose
it to mean--neither more nor less.'
                                              --Lewis Carroll

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT