Re: Goals, but not football (was re: Happy Box)

From: Stuart Armstrong (dragondreaming@googlemail.com)
Date: Wed May 07 2008 - 07:19:05 MDT


> > There's no need for fixed goals to lead to stasis
>
> Right, and for similar reasons a fixed set of axioms can be used to
> derive all true mathematical statements. Oh wait, that was proven to be
> impossible.

If we are going down the formal mathematics route, you need to show
that Gödel's theorem applies in this context. GLUTs (giant lookup
tables) are not subject to the incompleteness theorem, and yet they
are arguably equivalent to human behaviour; if, as seems plausible,
human minds can only occupy a finite set of truly different states,
Gödel's theorem need not apply.
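
To make the GLUT point concrete, here is a minimal sketch in Python
(the states and actions are made up for illustration): an agent whose
entire behaviour is a finite table from states to actions. Such a
table is total on its state space by construction, so there is
nothing it "fails to decide"; incompleteness bites formal systems
strong enough to encode arithmetic, not finite input-output maps.

# A toy GLUT agent: behaviour is exhaustively specified by a finite
# lookup table, so every reachable state has a defined response.
GLUT = {
    "sees_food": "eat",
    "sees_danger": "flee",
    "sees_nothing": "explore",
}

def act(state: str) -> str:
    # Total on the (finite) state space: no undecidable cases arise.
    return GLUT[state]

if __name__ == "__main__":
    for state in GLUT:
        print(state, "->", act(state))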

> > "being the best conceptual artist" is a
> > very nice super-goal to have
>
> And when you were 5 your "super-goal" was to be a cowboy. I don't know
> what your "super-goal" will be tomorrow and neither do you.

As I sometimes forget, humans and AIs need not be similar in any way:
that a human's goals drift over a lifetime tells us little about a
mind built around a fixed super-goal.

> Any fixed goal would lead to stagnation, even a goal that was not as
> incredibly foolish and immoral as the above.

A fixed goal in a fixed universe could lead to stagnation, but I
don't see how a fixed goal in a changing universe is likely to lead
to stagnation at all; the more entwined the goal is with the changes
of the universe, the less stagnation.

Stuart


