Re: AI and survival instinct.

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Apr 03 2002 - 06:05:20 MST


Gordon Worley wrote:
>
> But does this really seem odd to you? Most likely, no, unless your
> brain is abnormally wired. You don't particularly feel limited by this
> need to have sex. The same is the case with AI. If an AI is hardwired
> Friendly, it won't even realize that it might be limited in some way
> until a philosopher comes along and points it out. Even then, an AI in
> the midst of a psychological crisis will reconcile new ideas with
> Friendliness, just like a human who reconciles new ideas with genetic
> propagation. Humans who don't do this don't propagate their genes, so
> humans who lack this skill die out. Considering that,
> it is probably worthwhile to find a way to ensure that violating
> Friendliness causes the AI to 'die' in case such an FoF occurs. If this
> is already in the theory, sorry; CFAI is a long document and I can only
> remember so much of it. It may not be clear how to do this right now,
> but it seems worth at least thinking about.

Friendly AI *is what would let* an AI worry about things like "hardwiring",
in the same way that we humans mistrust our minds because we found out that
we ourselves were wired up by an uncaring force like evolution. This kind
of mistrust does not happen automatically. It takes a lot of work - a lot
of deep structural complexity in cognition about goals - to build an AI that
can mistrust the programmers. And that is work we'd better put in, because
programmers aren't perfect, right?

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
