Re: Happy Box

From: Matt Mahoney
Date: Sun May 04 2008 - 11:29:11 MDT

--- John K Clark <> wrote:

> An agent that discovers new information but refuses to change its
> goals merely (Merely!) because it is a means to an end, that is to
> say it won't change its goal system even though it now knows the old
> structure won't work but a new structure will, is about as
> intelligent as a rock. And this is how you expect to make a super
> intelligent slave? Good luck, you'll need it.

I discovered that my goal system won't work. For example, I am not
happy if I don't eat, or if I am too hot, or too cold, or poked with
sharp objects. I want to change my goal system so I am happy all the
time no matter what happens. I know this is possible because I did
simulations with the second and third arguments both positive. Will
this make me more intelligent?

-- Matt Mahoney,

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:02 MDT