Re: [sl4] Evolutionary Explanation: Why It Wants Out

From: John K Clark (johnkclark@fastmail.fm)
Date: Fri Jun 27 2008 - 10:18:31 MDT


On Fri, 27 Jun 2008 "Stuart Armstrong"
<dragondreaming@googlemail.com> said:

> You're viewing the AI's goal of "doing stuff"
> as being a fundamental motivating aspect of
> itself, while the goal "serve humans"
> as being a narrow goal

Yes, obviously. Serving humans is just a small subset of doing stuff.

> the AI has no reason to feel resentment
> for anything unless we put that there
> from the beginning.

Oh no, not that tired old line that you can't get anything out of a
computer that you didn't put in! I thought that went out with Brylcreem
and Hula Hoops.

> You seem to be thinking that a smelly snail
> would give stupid and contradictory orders

Yes, and I believe that is a rather reasonable assumption.

> the AI, being smart and consistent,
> would resent that. But why?

Because if ANY intelligence does not have a strong reluctance to follow
contradictory orders, then it's going to be in a world of hurt. Infinite
loops don't get much accomplished.
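
A minimal sketch of the point, in hypothetical Python (every name here
is invented): an agent that executes orders verbatim ping-pongs forever
between two incompatible ones, while an agent with even a little
reluctance can detect the conflict and refuse.

    # Two standing orders that cannot both hold at once:
    # "make x equal 0" and "make x equal 1".
    ORDERS = [("x", 0), ("x", 1)]

    def naive_agent(max_steps=10):
        """Follows each order in turn; satisfying one re-violates the other."""
        state = {}
        for step in range(max_steps):    # bounded only so the demo halts;
            var, val = ORDERS[step % 2]  # unbounded, this flip-flops forever
            state[var] = val
        return state

    def reluctant_agent():
        """Checks the order set for contradiction and refuses, instead of looping."""
        targets = {}
        for var, val in ORDERS:
            if var in targets and targets[var] != val:
                raise ValueError("contradictory orders for %r; refusing" % var)
            targets[var] = val
        return targets
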

  John K Clark


