From: Ben Goertzel (ben@goertzel.org)
Date: Tue Apr 25 2006 - 12:04:50 MDT
> Hmmmm.... I wasn't sure I would go along with the idea that goals are in
> the same category of misunderstood concepts as free will, consciousness,
> and memory.
>
> I agree that when these terms are used in a very general way they are
> often misused.
>
> But in the case of goals and motivations, would we not agree that an AGI
> would have some system that was responsible for maintaining and
> governing goals and motivations?
Well, the semantics of your statement is not entirely clear to me.
Do you mean "Any AGI system must have some subsystem that, to an
observer watching the system from the outside, appears to be heavily
involved in regulating what appear to be the system's 'goals' and
'motivations'"?
Or do you mean "Any AGI system must have a subsystem that contains
some sort of explicit representation of 'goals' and acts in accordance
with this representation"?
These are different things, right?
Novamente happens to have an explicit representation of goals, but not
all AGI systems need one, IMO.
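To make the distinction concrete, here is a minimal Python sketch (a
toy example of my own, not Novamente code; all the names are
hypothetical). Both agents behave identically from the outside, but
only the first contains anything an engineer could point to as a goal
representation:

class ExplicitGoalAgent:
    # Reading 2: the agent carries an explicit goal structure
    # and consults it when choosing an action.
    def __init__(self):
        self.goals = [{"name": "reach_setpoint", "target": 21.0,
                       "weight": 1.0}]

    def act(self, temperature):
        # Pick the highest-weight goal and act on its target.
        goal = max(self.goals, key=lambda g: g["weight"])
        return "heat" if temperature < goal["target"] else "idle"

class ImplicitGoalAgent:
    # Reading 1: no goal object exists anywhere in the system;
    # the "goal" lives only in the observer's description.
    def act(self, temperature):
        return "heat" if temperature < 21.0 else "idle"

for agent in (ExplicitGoalAgent(), ImplicitGoalAgent()):
    print(type(agent).__name__, agent.act(18.0))  # both choose "heat"

An outside observer would ascribe the "goal" of reaching 21 degrees to
both agents equally, which is exactly why the two readings above come
apart.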
> About memory? That
> sounds much more like a real thing than free will, surely? I don't think
> that is a fiction.
Much of what we feel internally to be "remembered" is actually
"constructed" during the process of so-called "remembering." See
Israel Rosenfield's old book "The Invention of Memory", for example...
Ideas like memory, goals, free will, and consciousness are useful
approximate models for discussing intelligent systems, but they are
not necessarily precise enough to guide AGI design or the detailed
analysis of AGI (or human brain/mind) dynamics.
All these concepts are crude "folk psychology," IMO.
-- Ben