RE: Zen singularity

From: Ben Goertzel (ben@goertzel.org)
Date: Wed Feb 25 2004 - 09:14:30 MST


The issue of "implicit goal-seeking behavior" is an important one.

For instance, a symmetric Hopfield neural net acts to progressively minimize its energy function.

Is this its goal?

We could say no, because it lacks any explicit representation of this goal -- i.e., it has not reflected on itself, and embodied in itself the pattern "I tend to act so as to minimize my energy function."

Or we could say yes, because it acts AS IF it is pursuing this goal...
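The Hopfield-net example can be made concrete with a minimal sketch (random symmetric weights and update counts are illustrative assumptions, not from the original post): under asynchronous sign updates with symmetric weights and no self-connections, the energy E = -1/2 s'Ws never increases, even though no "goal" is represented anywhere in the network.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
W = rng.normal(size=(n, n))
W = (W + W.T) / 2           # symmetric weights, as Ben specifies
np.fill_diagonal(W, 0)      # no self-connections

s = rng.choice([-1, 1], size=n)  # arbitrary initial state

def energy(W, s):
    return -0.5 * s @ W @ s

energies = [energy(W, s)]
for _ in range(200):
    i = rng.integers(n)                  # pick one unit at random
    s[i] = 1 if W[i] @ s >= 0 else -1    # asynchronous sign update
    energies.append(energy(W, s))

# The energy sequence is non-increasing: the net acts AS IF
# minimizing E, with no explicit representation of that goal.
assert all(e2 <= e1 + 1e-9 for e1, e2 in zip(energies, energies[1:]))
```

The implicit/explicit distinction is visible in the code itself: the update rule mentions only local fields `W[i] @ s`, yet the global quantity `energy` monotonically decreases as a side effect.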

-- Ben G

> -----Original Message-----
> From: owner-sl4@sl4.org [mailto:owner-sl4@sl4.org]On Behalf Of
> Christopher Healey
> Sent: Monday, February 23, 2004 1:01 PM
> To: sl4@sl4.org
> Subject: RE: Zen singularity
>
>
> Ben,
>
> I get the impression that many people on this list hear the term
> goal-system and equate it with discrete goal-systems, but also
> loosely include the proximal description of goal-oriented
> behaviors. In other words, the description of the discrete
> goal-system that would encapsulate the behaviors emerging from
> said "complex dynamic systems".
>
> How strictly one defines this term would seem to have a great
> impact on one's position, since the looser definition would
> seemingly encompass pretty much even basic
> algorithms. For example, without explicitly implementing a
> Bayesian decision network, an intelligence could still engage in
> behavior that strongly approximated one through its complex
> dynamics. Looking at it from both viewpoints is probably most
> productive, but I'm not sure that there is a strong
> organizational boundary where the goal-system can be said to be
> independent of the implementation. At least where complex dynamic
> systems are concerned.
>
> Unless, of course, you meant goal-system to refer only to a
> discrete goal system. Then it's clear. My only real observation
> here is that whenever I hear people arguing over different
> viewpoints on this matter, it seems to be one of semantics, and
> very often the opponents are championing similar views using
> different definitions for key terms.
>
> -Chris H.
>
> > Hmmm... I agree that the concept of "goal-system" is limiting
> > and there may be plenty of interesting, complex dynamical
> > systems out there that don't have "goal systems" or anything
> > like that...
> >
> > My current working def'n of "intelligence" is "able to
> > achieve complex goals in complex environments" but it may be
> > there are other useful characterizations of intelligence that
> > don't involve goals, or useful definitions of mind that don't
> > involve "intelligence" ...
> >
> > I'm not so sure it's the case that there's only ONE "non-goal
> > system", though. There may be many kinds of complex,
> > intelligent-ish dynamics whose actions aren't conveniently
> > modeled using the language of "goals"
> >
> > -- Ben G



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:46 MDT