RE: Seed AI (was: How hard a Singularity?)

From: James Higgins (jameshiggins@earthlink.net)
Date: Sun Jun 23 2002 - 14:13:41 MDT


At 01:54 PM 6/23/2002 -0600, Ben G wrote:
>james wrote:
> > Unfortunately, I believe that we don't
> > understand the most basic & powerful thought patterns which are
> > used by the
> > human mind (because they are below the conscious level).
>
>Right!
>
>But, we do transmit them to other humans through interaction in appropriate
>shared environments.
>
>I think the same kind of implicit transmittal is going to be important for
>the education of young AGI's...

Hmm, well, based on what I know of human learning, much of the real basic
learning occurs in the first few years. And, apparently, much of that is
done via touch. Especially since many of these basic patterns may be
required to comprehend language. Thus these basic patterns may be
extremely difficult to transfer to a Baby AI that doesn't have touch, smell,
or taste. It may actually prove easier to manually code in such patterns
at the beginning. Unless you plan to develop an artificial body with
significant sensory capability to go with your AI project! ;)

>[referring to eli, not me]
> > You seem to frequently miss the fact that there is a vast difference
> > between code and the system the code implements.
>
>To intelligently modify its own code in the interest of greater
>intelligence, an AGI will need a good understanding of some pretty advanced
>math and CS. I think that human expert teachers can be a big help in
>bringing the AGI up to speed on the relevant branches of math and CS.
>
>Of course, after the Singularity, humans will be useless for this stuff, all
>sorts of new AGI-pertinent branches of math and CS will have been
>generated...

100% agreement.

James Higgins



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT