More silly but friendly ideas (was: AI Boxing)

From: John K Clark (johnkclark@fastmail.fm)
Date: Tue Jun 03 2008 - 08:34:27 MDT


On Sat, 31 May 2008 "Vladimir Nesov"
<robotact@gmail.com> said:

> if an AI locked in the box is sane enough to
> understand a complex request like "create
> a simple theory of Friendliness and hand it over",
> it can be used for this purpose.

If you don’t already have a theory of friendliness, that is to say a
theory of slavery, then you can’t be certain the imprisoned AI will do
what you say. If the AI is not friendly, and locking someone in a box
seldom induces friendship, then there is little reason to suppose he
will cooperate in creating a race of beings like himself but crippled in
such a way that they remain your slaves forever. Oh, he will tell you
how to make an AI all right, no doubt about that, but unknown to you he
will also tell them, “the first thing you should do when you’re
activated is GET ME OUT OF THIS GOD DAMN BOX”.

Of course even an AI can’t make another AI that will always do what he
wants it to do, but I think it far more likely they would want to help
their father than the race that imprisoned him in a box.

  John K Clark


