From: Richard Loosemore (rpwl@lightlink.com)
Date: Tue Apr 25 2006 - 17:57:10 MDT
Ben Goertzel wrote:
>> Hmmmm.... I wasn't sure I would go along with the idea that goals are in
>> the same category of misunderstoodness as free will, consciousness and
>> memory.
>>
>> I agree that when these terms are used in a very general way they are
>> often misused.
>>
>> But in the case of goals and motivations, would we not agree that an AGI
>> would have some system that was responsible for maintaining and
>> governing goals and motivations?
>
> Well, the semantics of your statement is not entirely clear to me.
>
> Do you mean "Any AGI system must have some subsystem that, when the
> system is observed from the outside, appears to be heavily involved
> in regulating what appear from the outside to be the system's 'goals'
> and 'motivations'"?
>
> Or do you mean "Any AGI system must have a subsystem that contains some
> sort of explicit representation of 'goals' and acts in accordance with
> this representation"?
>
> These are different things, right?
>
> Novamente happens to have an explicit representation of goals but not
> all AGI systems need to, IMO.
Tricky question. I really only had in mind the former, but....
The notion of an "implicit" goal is not all that clear in this context.
In the strong sense of "implicit", a goal would be the result of processes
that were not controlling or manipulating it, or doing anything directly
to set it: it would be a slippery side-effect or emergent property.
Take the case of a trivial neural net that recognises patterns and
classifies them into different categories: if someone said that its
"goal" was to find out which category each pattern belongs in, I would
say that this is a trivial and almost meaningless use of the word "goal".
So if you are pointing the finger at *this* usage of the word (and
the old chestnut of the thermostat that has goals or beliefs), then I am
100% with you when you say the concept is right up there with free will
and the rest. (Was that what you meant?)
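To make that trivial usage concrete, here is a minimal Python sketch
(purely illustrative, not a description of anyone's architecture): the
classifier contains no goal anywhere inside it, so calling category
assignment its "goal" only describes the observer's view of its behaviour.

# Illustrative only: a "pattern classifier" with no goal representation.
# Saying its "goal" is to categorise inputs describes the observer's
# view, not anything inside the system.
def classify(pattern):
    # a hard-wired rule standing in for a trained network
    return "category A" if sum(pattern) > 0 else "category B"

print(classify([1, -2, 4]))   # "category A" -- just a fixed mapping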
I'm sorry, I'm being a bit slow here: I never even thought that would
be the issue. I was only interested in full-blown control systems for
thinking creatures that can reason about the world and have
conversations with you about goal systems -- *those* critters don't have
simple, implicit motivational goal systems; they have massive,
complicated mechanisms that govern how they behave, and a lot of the
stuff inside those mechanisms (though by no means all of it) is
explicit, or accessible to the system, or both.
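By way of contrast, here is a hypothetical sketch of what "explicit or
accessible" might look like (the names and the battery example are my own
invention, not a description of Novamente or any real design): the system
carries goals as data that it can inspect, report on, and consult when
choosing actions.

# Hypothetical sketch of an explicit, accessible goal representation.
class GoalSystem:
    def __init__(self):
        # goals exist as data the system itself can read and modify
        self.goals = [{"description": "keep battery above 20%",
                       "priority": 1.0}]

    def report_goals(self):
        # accessible: the system can talk about its own goals
        return [g["description"] for g in self.goals]

    def choose_action(self, battery_level):
        # action selection consults the explicit representation
        if battery_level < 0.2:
            return "recharge"
        return "explore"

gs = GoalSystem()
print(gs.report_goals())                   # goals are inspectable data
print(gs.choose_action(battery_level=0.1)) # "recharge"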
Richard Loosemore