From: Eliezer S. Yudkowsky (firstname.lastname@example.org)
Date: Wed Feb 27 2002 - 12:04:26 MST
Michael Roy Ames wrote:
> Eliezer wrote:
> > I can't speak for Ben, but certainly SIAI does not advocate (and never has)
> > the explicit codification of knowledge distilled from human experts.
> > Knowledge is learned complexity; it is abstracted from experience, not
> > hardwired by the programmers.
> There *must* be some explicitly codified knowledge in a Seed AI, at least
> enough to get it started... else, how does it think about anything? I
> imagine that, once a Seed AI started to learn 'on its own', then its learned
> knowledge base would quickly become a lot larger than its pre-entered data.
Knowledge isn't code. To explicitly create knowledge, you'd have to build
tools, working on the same principles as the AI's present or future
cognitive subsystems, that would let you craft the same kind of content
the AI expects to find internally.
I'd expect such crafted knowledge to lack the useful richness of genuinely
abstracted knowledge. The next step beyond it is programmer-driven tutoring, in which
you place the AI in virtual environments containing challenges that the AI
can only solve by inventing new strategies, abstracting new concepts, or
inducing new beliefs. That's the point at which "real" knowledge begins
entering the system.
Independent learning would come after that.
-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT