Re: Deadly Sins of Real AI

From: Eliezer S. Yudkowsky
Date: Tue Apr 02 2002 - 19:29:26 MST

Ben Goertzel wrote:
> I think that I have clarified my goals for you many times. First a
> roughly-human-equivalent intelligence that can be taught and conversed
> with, then (growing out of this) a self-modifying intelligence that can
> bootstrap its way to tremendously superhuman intelligence. If successful,
> this will lead either to the Singularity or -- perhaps, I hope not! -- to a
> system that understands the hidden obstacles to the Singularity that we
> stupid humans are missing...

Are you sure you really mean "human-equivalent" in the paragraph above? As
you know, I've spent some time thinking about epochs in AI intelligence, and
I would suggest the following terminology:

  Tool-level AI: The AI's behaviors are immediately and
directly specified by the programmers, or the AI "learns"
in a single domain using prespecified learning algorithms.

  Prehuman AI: The AI's intelligence is not a significant
subset of human intelligence. Nonetheless, the AI is a
cognitive supersystem, with some subsystems we would
recognize, and at least some mind-like behaviors. (A
toaster oven does not qualify as a "prehuman chef"; a
general kitchen robot might do so.)

  Infrahuman AI: The AI's intelligence is, overall, of the
same basic character as human intelligence, but
substantially inferior. The AI may excel in a few domains
where it possesses new sensory modalities or other
brainware advantages not available to humans. Humans
talking to the AI usually recognize a mind on the other
end. (An AI that lacks the ability to communicate and
model external minds does not yet qualify as infrahuman.)

  Near-human AI, human-equivalent AI: The AI's
intelligence is in the rough neighborhood of a human's. It
may be locally inferior or superior in various domains,
but general intelligence, reasoning ability, and learning
ability are roughly that of a human.

  Transhumanity: "Weak" transhumanity is intelligence which
reasons like a human but at a much higher speed. "Strong"
transhumanity implies greater functional complexity and
the ability to think thoughts uninventable by humans or
manipulate concepts which are humanly incomprehensible.

It actually goes up from here to "superintelligence" and "Power", but who
cares. Anyway, I would expect first steps toward seed AI to become possible
at the prehuman or infrahuman level, depending on the approach and the kind
of self-improvement being attempted. When you say "human-equivalent" in
your original statement, do you mean what I would call "infrahuman", i.e.,
intelligence of roughly the same character but substantially inferior in
terms of actual capabilities?

-- -- -- -- --
Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT