From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun May 05 2002 - 19:37:32 MDT
Ben Goertzel wrote:
>
> > Peter Voss wrote:
> > >
> > > I generally agree with Ben's response to Eli's review.
> > >
> > > Specifically, I agree that Real AI (including Seed AI) will require a
> > > simpler, cleaner design rather than the kind of complexity that Eli
> > > seems to call for. Really understanding what general intelligence is,
> > > and what its essentials are, is the key to an effective implementation.
> >
> > Um... DGI *is* the simpler, cleaner, bare-bones design based on really
> > understanding general intelligence and its essentials.
>
> Eliezer, your DGI paper absolutely does NOT give a design for an AGI.
I know that. DGI is the name of the theory. "Levels of Organization in
General Intelligence" is the paper. Sort of like the distinction between
the Novamente design and the Novamente manuscript.
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence