From: Ben Goertzel (ben@goertzel.org)
Date: Tue Apr 25 2006 - 08:23:59 MDT
> I think that the question of an AI's "goals" is the most important issue
> lurking beneath many of the discussions that take place on this list.
>
> The problem is, most people plunge into this question without stopping
> to consider what it is they are actually talking about.
Richard, this is a good point.
"Goal", like "free will" or "consciousness" or "memory", is
1) a crude abstraction that we use to describe certain aspects of a
complex cognitive system (the human mind/brain), but that does not
fully or correctly describe those aspects

2) an abstraction that some complex cognitive systems use as an
**ideal**, i.e. they try to modify themselves so as to become more
explicitly goal-oriented, never fully succeeding (at least in the
human case)
When I say that an AGI has certain goals, I don't necessarily mean
that the AGI is orienting all its actions toward these goals, just
that
* the statement that the AGI is "pursuing" these goals is a fair
approximation of its behavior
* the AGI is explicitly engaged in an ongoing process of making
itself more goal-oriented, so that it can better achieve these goals,
which it explicitly conceptualizes
-- Ben G