From: Dan Clemmensen (dgc@cox.rr.com)
Date: Fri Aug 03 2001 - 06:53:47 MDT
Joaquim Almgren Gāndara wrote:
> James Higgins:
>
>>Pick any number you like, but others on this list have
>>argued, quite convincingly, that it would at least have
>>to be intelligent enough to understand what it was
>>doing. It is very unlikely that something with half
>>the intelligence of an average human could comprehend
>>AI software. And, so far as I've heard, no one on
>>here is building a "Codic Cortex" into their software.
>>I believe that is something that is expected to develop
>>eventually. I think your being picky.
>>
>
> That's "~you're~ being picky". ;)
>
> Seriously speaking, you didn't address the other possibility. What if
> it needs to be seven times as smart as a human in order to improve its
> own code? Let's assume that there is no codic cortex. Let us also
> assume that Ben or Eli manage to create a human-level AI. What if it
> looks at its own code, just goes "Oh, wow, you've done some really cool
> stuff here" and then ~can't~ improve the code? If it takes two or more
> ~intelligent~ people to create an AI equivalent to the ~average~
> human, what's to say that the AI can create a ~trans-human~ AI? Isn't
> that a leap of faith?
>
I think we are having problems with the terms "smart" and
"intelligence." To me, these terms are at best a sort of aggregate
measure for a broad class of loosely correlated capabilities. The only
capability of interest for bootstrapping is technical creativity.
If we can create an entity with this capability, and the capability's
implementation itself has a heavy technical component, then we should be able
to bootstrap. The obvious instance of such an entity would be an AI. We
"merely" need the AI to be sufficiently technically creative to enhance
its own technical creativity. If, for example, Eliezer builds the AI,
then the AI need not be as "smart" as Eliezer. It merely needs to be as
technically creative as Eliezer. In fact, it may not even need to be
that good. It merely needs to be good enough to start improving itself.
In my experience, creating a system from scratch takes a lot more
creativity than improving that system. Now it may need an Eliezer to
dream up new and qualitatively different subsystems to add to the
system. However, I'm hoping that we will eventually be able to use
the AI as an "Eliezer multiplier." Eliezer specifies an architecture,
and perhaps even does a crude prototype, and the AI then refines and
optimizes the result. In this case, Eliezer plus the AI form a system
whose technical creativity has a major technical component in its
implementation, and the bootstrap accelerates.
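To make the "good enough to start improving itself" intuition concrete, here is a
minimal toy sketch in Python. The single-number "creativity" score, the threshold,
and the gain function are all invented for illustration; nothing here comes from
anyone's actual AI design. It only shows the feedback-loop shape of the argument:
below some threshold each round of self-modification gains less than it loses,
above it the gains compound.

# Toy model of the bootstrap argument (illustrative only).
# "Technical creativity" is collapsed into one number c; each round of
# self-improvement multiplies c by a gain that itself grows with c.
# The threshold and slope are made-up parameters, not real estimates.

def gain(c, threshold=1.0, slope=0.1):
    # Hypothetical gain: below the threshold the entity cannot quite
    # improve itself (gain < 1); above it, each round yields more.
    return 1.0 + slope * (c - threshold)

def bootstrap(c, rounds=5):
    history = [c]
    for _ in range(rounds):
        c = c * gain(c)
        history.append(c)
    return history

print(bootstrap(0.9))   # starts below the threshold: decays, never takes off
print(bootstrap(1.1))   # starts above it: each round's gain exceeds 1 and compounds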