From: Dimitry Volfson (email@example.com)
Date: Mon Aug 22 2005 - 16:51:23 MDT
I suppose that I am thinking more about conceptual systems. For example,
without a concept of "reverse" on a vehicle, roads and driveways would
look different, because you could never move a vehicle backward (by the
vehicle's own power) by only pushing the brake and the accelerator. An
intelligence has to find ways of economically expanding its conceptual
systems.
Goedel concerns himself with fixed systems, not systems that can learn
new (basic) concepts without becoming a different system.
I mean, what is a system? A set of concepts and the rules those concepts
follow.
When the agent consults the oracle of reality by conducting reality tests
(experiments), the agent has the potential to learn new (for it) concepts.
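The contrast between a fixed system and an oracle-consulting agent can be sketched in code. This is my own illustrative toy, not anything from the thread: all names (`closed_derivations`, `reality_oracle`, the vehicle facts) are assumptions chosen to echo the "reverse" example above.

```python
# Toy sketch: a closed formal system vs. an agent that can query an
# external "oracle" (reality). Purely illustrative; names are invented.

def closed_derivations(axioms, rules, steps=3):
    """Everything derivable inside the fixed system: the axioms plus
    whatever the rules produce, applied repeatedly."""
    known = set(axioms)
    for _ in range(steps):
        known |= {rule(fact) for rule in rules for fact in known}
    return known

def reality_oracle(query):
    """Stands in for an experiment: the answer is produced outside
    the agent's formal system, not derived within it."""
    observations = {
        "push brake": "vehicle stops",
        "shift reverse": "vehicle moves backward",
    }
    return observations.get(query, "no result")

# The closed system never derives "vehicle moves backward" ...
axioms = {"push accelerator"}
rules = [lambda f: "vehicle moves forward" if f == "push accelerator" else f]
internal = closed_derivations(axioms, rules)
assert "vehicle moves backward" not in internal

# ... but an experimenting agent can add it after consulting the oracle.
learned = internal | {reality_oracle("shift reverse")}
assert "vehicle moves backward" in learned
```

The point of the sketch: no amount of rule application inside the closed system produces the new fact; only the external query does, which is the sense in which learning by experiment steps outside the system.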
On Mon, 22 Aug 2005 15:02:24 -0700 (PDT) Phil Goetz <firstname.lastname@example.org> wrote:
> --- Dimitry Volfson <email@example.com> wrote:
> > Doesn't Goedel only show that attempting to say something about the
> > system, within the system, can lead to paradox? But a thing cannot
> > always be in meta-relation to itself. Is it really difficult to
> > determine when this occurs, and to shift to a higher-level system?
> The agent doing the determining will be operating within
> some system. It cannot shift out of that system; that
> would be the same as asking a Turing machine to emulate
> something more powerful than a Turing machine.
> However, an agent interacting with the real world has an
> "oracle" function that it can provide inputs to, and then
> get outputs from that were not computed within that system.
> Hence, Goedel's incompleteness theorem doesn't apply.
> - Phil
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:52 MDT