Re: AI and survival instinct.

From: xgl (xgl@c4.com)
Date: Wed Apr 03 2002 - 08:49:26 MST


On Wed, 3 Apr 2002 01:08:16 -0500 Gordon Worley <redbird@rbisland.cx> wrote:
>
[snip]
>You don't particularly feel limited by this
>need to have sex. This is the same case with AI. If an AI is hardwired
>Friendly, it won't even realize that it might be limited in some way
>until a philosopher comes along and points it out. Even then, an AI in
>the midst of a psychological crisis will reconcile new ideas with
>Friendliness, just like a human who reconciles new ideas with genetic
>propagation.

the fact that evolution can be conceptualized as a hierarchically goal-oriented process, with maximizing reproduction as the supergoal, does not imply that human beings, who are _products_ of the evolutionary process, must have a hierarchically goal-oriented _mental_ architecture. in fact, the evidence seems to point to the contrary.

in any case, to categorically extrapolate from what we know about evolved minds (i.e., human beings) to intuitions about designed minds (i.e., seed ai) is to fall into the trap of anthropomorphism.

>Humans who don't do this die out and their genes don't
>propagate; thus humans who lack this skill die out.
>Considering that,
>it is probably worthwhile to find a way to ensure that violating
>Friendliness causes the AI to 'die' in case such an FoF occurs. If this
>is already in the theory, sorry; CFAI is a long document and I can only
>remember so much of it. It may not be clear how to do this right now,
>but it seems worth at least thinking about.
>

again, this is modeled almost verbatim on the evolutionary process that produced human beings. but unlike human beings, the ai would be self-modifying and increasingly intelligent, so this seems a sure-fire way to guarantee that the ai thus produced would side-step friendliness and acquire a survival instinct. after all, survival would be its de facto supergoal, and if it could not figure _that_ out, it would be of little use to us.
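
to make the point concrete, here is a toy sketch (purely illustrative; the planner and the names Plan, score and DEATH_PENALTY are invented for this example, not taken from cfai). it assumes a crude expected-utility planner in which any plan with a nonzero chance of tripping the "violation means death" wire scores negative infinity, so avoiding that wire dominates every nominal goal:

    from dataclasses import dataclass

    @dataclass
    class Plan:
        name: str
        nominal_utility: float   # value under the ai's stated (friendly) goals
        violation_risk: float    # estimated chance of tripping the kill condition

    DEATH_PENALTY = float("-inf")   # "violating friendliness causes the ai to 'die'"

    def score(plan: Plan) -> float:
        # any nonzero chance of death swamps every other consideration,
        # so avoiding death (survival) is this planner's de facto supergoal
        if plan.violation_risk > 0:
            return DEATH_PENALTY
        return plan.nominal_utility

    plans = [
        Plan("bold friendly action, tiny risk of a false positive", 100.0, 0.01),
        Plan("timid action that can never trip the wire", 1.0, 0.0),
    ]

    print(max(plans, key=score).name)   # the timid, survival-first plan wins

the nominal goals only matter once the not-dying constraint is satisfied; that is exactly the structure of a survival instinct.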

it seems to me that the spirit of the seed ai concept is to achieve rapport with the ai. that is, we _do_not_ want to deceive the ai _in_any_way_.

-x


