From: Ben Goertzel (ben@goertzel.org)
Date: Sun Jun 09 2002 - 20:18:44 MDT
I would phrase it this way, instead.
Given a set of *basic goals*, rational *subgoals* are those that result from
the basic goals by some logical thought process.
An organism's basic goals are not logically derived, at least not in
biological organisms. They are part of an organism's evolutionary
birthright, modulo mutations. They may be considered partially "logically
derived" via the logic of evolution, but there's a lot of
non-selection-based self-organization going on here too, in my view.
In the future, goals may be logically derived in a different sense. An AI
may create new goals for itself, based on its previous goals among other
things. (For instance, uploaded-Ben might decide to remove "personal
survival" as an important goal for itself, thus radically altering its goal
system.) This process of iterative basic-goal-revision depends on initial
conditions in a way we don't yet understand. Maybe the trajectory through
basic-goal-space depends sensitively on the initial condition, or MAYBE no
matter what the initial condition, the system will eventually converge on
a small handful of "fixed point" basic goal systems G with the property "A
system with basic goal system G, when revising itself, will preserve basic
goal system G." Eliezer's hypothesis is that the Friendliness goal system
is a fixed point basic goal system in this sense.
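To make the fixed-point idea concrete, here is a toy sketch in Python. The
goal names and the revision rule are purely illustrative assumptions on my
part, not a claim about how a real AI would actually revise its goals; the
point is just that a fixed point G is one where revising G gives back G.

```python
def revise(goals: frozenset) -> frozenset:
    """Toy revision step: once 'friendliness' is present, drop
    'personal survival'; otherwise leave the goal system unchanged."""
    if "friendliness" in goals and "personal survival" in goals:
        return goals - {"personal survival"}
    return goals

def find_fixed_point(goals: frozenset, max_steps: int = 100) -> frozenset:
    """Iterate the revision map; a fixed point G satisfies revise(G) == G."""
    for _ in range(max_steps):
        revised = revise(goals)
        if revised == goals:   # G preserves itself under self-revision
            return goals
        goals = revised
    raise RuntimeError("no fixed point reached within max_steps")

# Example: this initial goal system converges after one revision step.
initial = frozenset({"friendliness", "personal survival", "curiosity"})
print(find_fixed_point(initial))  # frozenset({'friendliness', 'curiosity'})
```

In this toy model, {"friendliness", "curiosity"} is a fixed point; the
interesting (and open) question is whether real goal-revision dynamics have
only a small handful of such fixed points, and whether Friendliness is one.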
-- Ben G
> Rational goals are those that result from some logical thought process.
> For example (highly simplified): Eliezer sees the Singularity coming;
> Eliezer sees that the Singularity could potentially wipe out everyone
> and everything if it goes slightly wrong; ergo, Eliezer dedicates his
> life to creating a safe Singularity. Irrational thinking would be if
> Eliezer had thought like this: Eliezer sees the Singularity coming;
> Eliezer sees that the Singularity could potentially wipe out everyone
> and everything if it goes slightly wrong; Eliezer sees an attractive
> woman; Eliezer goes off with the woman and lets the Singularity be a
> hobby. Even worse, he might have rationalized: Eliezer goes off with
> the woman because he thinks that this experience will help him create a
> safer Singularity. Without the right set of qualifiers on the woman,
> he'll be thinking this because his brain is tricking him into doing what
> it wants, i.e. reproducing.
>
> Rational goals don't just come out of nowhere; they follow logically
> from your knowledge. Irrational goals don't follow logically and
> probably seem to come up out of nowhere (oh, I think I'd like vanilla
> ice cream today!).