Re: How hard a Singularity?

From: Eliezer S. Yudkowsky
Date: Sat Jun 22 2002 - 16:35:48 MDT

Ben Goertzel wrote:
>> That there is *nothing special* about human-equivalent intelligence!
> Wrong! There is something VERY special about human-level intelligence, in
> the context of a seed AI that is initially created and taught by humans.
> Human-level intelligence is the intelligence level of the AI's creators
> and teachers.
> Our seed AI is going to get its human-level intelligence, not purely by
> its own efforts, but largely based on the human-level intelligence of
> millions of humans working over years/decades/centuries.

Ah. Well, again we have very different models of intelligence. I don't
think you can use human knowledge as the mindstuff of an AI. I don't think
AI can be built by borrowing human content. What we're after is whatever
*produced* that content. The task is building brainware and, in the early
stages, guiding that brainware into creating content. There may be some
opportunities to nudge the AI into creating better content than she
otherwise would have, but this violates the self-encapsulation rule of seed
AI: an AI doesn't own a thought until she can produce it on her own.
Initially, of course, almost all AI development will violate this rule; but
the measure of how far you've gotten in constructing a real, independent
AI - one that has her own thoughts and isn't just churning through yours -
is not having to do this any more.

So you look at pouring human content into an AI and say, "When we reach
human level, we will run out of mindstuff." And I look at creating AI as
the task of building more and more of that essential spark that *creates*
content - any transfer of content the AI could not have created on her own
being a bootstrap method or a side issue - and say: "When the AI reaches
human level, she will be able to swallow the thoughts that went into her
own creation; she will be able to improve her own spark, recursively."

An AI will have much to learn from human mind-content, but by the time
reaching human level is anything like a live issue, the most important part
of what she knows will belong to her; it won't be borrowed from humans.

(Hm. I think ve/vis/ver was better after all.)

Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT