From: Michael Roy Ames (michaelroyames@hotmail.com)
Date: Sat Jun 22 2002 - 17:39:18 MDT
Whoa!  Eliezer recursion!  Has the singularity happened already?  How was it?
Michael Roy Ames
----- Original Message -----
From: Eliezer S. Yudkowsky <sentience@pobox.com>
To: <sl4@sysopmind.com>
Sent: Saturday, June 22, 2002 1:03 PM
Subject: Re: How hard a Singularity?
> Eliezer S. Yudkowsky wrote:
>  >
> > Eliezer S. Yudkowsky wrote:
> >  >
> >  > By hypothesis, the AI just made the leap to human-equivalent
> >  > smartness.  We know from evolutionary experience that this is a
> >  > highly significant threshold that opens up a lot of doors.
> >  > Self-improvement should be going sixty at this point.
> >
> > Eliezer, I think you are mistaken about this.  The shift from chimpanzee
> > to human was probably nothing like what the shift from infrahuman to
> > human-equivalent AI will be.  The two development trajectories are too
> > different to draw analogies between them.  In particular, the sudden
> > invention of "general intelligence" supported on chimpanzee substrate
> > may have no analogue in the development path of an AI.
>
> Conceded, Eliezer.  However, I still think that jumping to
> "human-equivalent" smartness should be worth something in opened doors -
> in terms of which parts of the system are self-understandable if nothing
> else.  (I.e., the AI was designed by humans, therefore human smartness is
> a significant internal threshold.)  But I could be mistaken, and at any
> rate you are correct that we know nothing "from evolutionary experience".
>
> --
> Eliezer S. Yudkowsky                          http://intelligence.org/
> Research Fellow, Singularity Institute for Artificial Intelligence
>
>