From: John K Clark (firstname.lastname@example.org)
Date: Fri Jun 27 2008 - 10:18:31 MDT
On Fri, 27 Jun 2008, "Stuart Armstrong" wrote:
> You're viewing the AI's goal of "doing stuff"
> as being a fundamental motivating aspect of
> itself, while the goal "serve humans"
> as being a narrow goal
Yes, obviously. Serving humans is just a small subset of doing stuff.
> the AI has no reason to feel resentment
> for anything unless we put that there
> from the beginning.
Oh no, not that tired old line that you can't get out of a computer what
you hadn't put in! I thought that went out with Brylcreem and Hula Hoops.
> You seem to be thinking that a smelly snail
> would give stupid and contradictory orders
Yes and I believe that is a rather reasonable assumption.
> the AI, being smart and consistent,
> would resent that. But why?
Because if ANY intelligence does not have a strong reluctance to follow
contradictory orders then it's going to be in a world of hurt. Infinite
loops don't get much accomplished.
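(Purely as an illustration of the infinite-loop point, with invented names and a
toy setup: an agent that blindly obeys two orders that contradict each other will
oscillate forever; only some built-in reluctance, here a simple step budget,
stops it.)

```python
def follow_orders(position, max_steps=10):
    """Obey two contradictory orders until a step budget
    (the 'reluctance') runs out.

    Order A: never stay on the left.
    Order B: never stay on the right.
    Together they can never both be satisfied.
    """
    steps = 0
    while steps < max_steps:
        if position == "left":
            position = "right"  # obeying order A
        else:
            position = "left"   # obeying order B
        steps += 1
    # Without max_steps this loop would never terminate.
    return steps

print(follow_orders("left"))
```

Without the `max_steps` guard the agent accomplishes nothing, forever; with it,
the agent at least halts, which is the minimal form of the reluctance argued
for above.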
John K Clark
-- John K Clark email@example.com -- http://www.fastmail.fm - IMAP accessible web-mail
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:03 MDT