Re: Seed AI (was: How hard a Singularity?)

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Wed Jun 26 2002 - 12:08:05 MDT


Ben Goertzel wrote:
>
> Eliezer,
>
> 1) Your statement that "it is impossible to write a story about a character
> smarter than you are" is clearly false, and rather odd.
> Many authors have written good stories about characters smarter than they
> are, and evoked this superior intelligence well. As a single example,
> consider the sci-fi classic "Flowers for Algernon."

As it happens, that *specific example* is used in "Staring into the
Singularity" to bring out the point.

If Charlie Gordon had been as smart as he was depicted as being, he would
have realized that his brain was probably due to burn out without needing to
be bitten by Algernon, and he could have understood what was going on before
performing a complex analysis of Strauss and Nemur's surgical technique. He
could have reasoned that since nothing about the surgical technique (as
depicted in the story) prevented its effect from arising as a natural point
mutation, and since the technique could be applied so universally within the
mammalian line as to work on Algernon as well, artificially heightened
intelligence was probably a net evolutionary disadvantage. Of course it
would still be valuable to have the specific quantitative demonstration that
artificially increased intelligence deteriorates at a rate of time directly
proportional to the quantity of the increase.
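
On one natural reading, that quantitative claim has a simple shape. As a
minimal formalization (my notation, not Keyes's), let \Delta I be the
artificially induced gain and I(t) the subject's intelligence at time t
after the peak; the effect then reads roughly as

    \frac{dI}{dt} = -k \, \Delta I, \qquad k > 0,

i.e., the rate of deterioration is constant in time but directly
proportional to the size of the original increase, so a larger enhancement
burns out proportionally faster.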

Nonetheless it does demonstrate one of the basic differences between
intelligence and smartness. Daniel Keyes can give Charlie Gordon a dozen
languages as long as they don't appear in the dialogue (or, if necessary,
Keyes could have gotten someone else to produce the dialogue); he can have
Charlie Gordon do lightning calculations in his head as long as the
calculations don't appear in the book, or as long as Keyes can do the
calculations with pen and paper over time; he can have Charlie Gordon invent
a calculus of intelligence as long as it doesn't have to be produced within
the story. But Charlie's upper limit on smartness is the smartness of Keyes.
He does the things that Keyes imagines himself doing with Charlie's
abilities, but not the things that a real Charlie Gordon would do. The
events depicted in "Flowers for Algernon" happen to be roughly plausible in
the real world (which is why I selected that example for "Staring into the
Singularity"). In the story Charlie attributes the burnout to a mysterious
neurological effect which is depicted but never explained; within the story
it's hidden behind Charlie's opaque science skillz. The nontechnical,
evolutionary reason I gave above, which can easily be explained to a lay
reader, is actually much closer to the heart of smartness than Charlie's
madd science skillz - but for that very reason, Charlie didn't think of it.

Or maybe Keyes did think of it but decided not to include it in the story -
Keyes isn't prohibited from writing a dumber character, just a smarter one.
The point is that real smartness is often very simple, and for that reason
is far stranger and more powerful than technobabble churned out to conform
to the Spock stereotype - "genius" as a human dressed up as a computer for
Halloween. The reason I don't like to see people trying to export our
conception of "physical limits" onto the Singularity is not so much that I
expect post-Singularity entities to break the laws of physics with madd
t3chnology skillz (though that is also a possibility) as that there could be
some simple little reason we haven't thought of why you don't *need* to
break the laws.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

