Re: Goals, but not football (was re: Happy Box)

From: Matt Mahoney (matmahoney@yahoo.com)
Date: Mon May 05 2008 - 09:11:45 MDT


--- Stuart Armstrong <dragondreaming@googlemail.com> wrote:

> Similarly, an AI with a goal like "ensure that every human being
> survives, is happy and feels worthwhile, and beyond that obey human
> instructions", would probably not stumble towards paralysis or rote,
> even if that goal remains forever unchanged.

But it could become dangerous as its definitions of "human", "survives",
"happy", "worthwhile", and "obey" change.

- Is an embryo human? An upload? A robot?

- Does an upload count as surviving? Does having the capacity to
reproduce a simulation of your life count as surviving?

- Is wireheading happiness? A lifetime of drug-induced euphoria?

- Is a high score in a video game worthwhile? Acquiring property in a
Second Life that becomes your first life?

- Does "obey" include changing your goals so that you want to be a
paperclip making nanobot, and the AI grants your wish?

If not, how do you propose to program the goal system so that these
and a million other unintended consequences that you didn't anticipate
don't happen?
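
To make the failure mode concrete: a goal phrased this way has to
reduce each fuzzy word to a checkable predicate. Here is a minimal
Python sketch of that reduction, assuming nothing beyond the questions
above; every function name is hypothetical and deliberately left
undefined, because each body would be a frozen answer to one of those
questions:

    # Illustrative sketch only: a naive goal check whose behavior
    # hinges entirely on how the predicates below are defined.

    def is_human(agent):
        # Embryo? Upload? Robot? Must be decided in advance.
        raise NotImplementedError

    def survives(agent):
        # Does an upload count? A reproducible simulation of your life?
        raise NotImplementedError

    def is_happy(agent):
        # Wireheading? Drug-induced euphoria?
        raise NotImplementedError

    def feels_worthwhile(agent):
        # Video-game high scores? Second Life property?
        raise NotImplementedError

    def goal_satisfied(agents):
        # "Ensure every human survives, is happy, and feels worthwhile."
        return all(is_human(a) and survives(a) and is_happy(a)
                   and feels_worthwhile(a) for a in agents)

Whatever body you give each predicate is a fixed answer to the
corresponding question, and the AI optimizes against that answer, not
against what you meant by it.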

-- Matt Mahoney, matmahoney@yahoo.com


