Re: How hard a Singularity?

From: James Higgins (jameshiggins@earthlink.net)
Date: Sat Jun 22 2002 - 17:42:07 MDT


Ok, I'm confused. Is Eliezer replying to himself??? If so, could someone
(who knows his physical location) please call the men in white coats?

Thank you,
James Higgins

At 05:33 PM 6/22/2002 -0400, you wrote:
>Eliezer S. Yudkowsky wrote:
> > Michael Roy Ames would have written:
> >
> >> Eli1: What is the mistaken part? I don't see Eli2 drawing an analogy
> >> between chimpanzees and humans... I see him drawing an analogy between
> >> every-other-species-before-now and humans. The point being, once you
> >> can think 'as well as' a human, a large number of things become
> >> possible. Where's the mistake?
> >
> > The mistake is that every-other-species-before-now was not generally
> > intelligent, whereas humans are.
>
>If I can amplify on what Eliezer says here: The notion is that the shift
>from chimpanzee to human may have been much more of a qualitative,
>system-level shift than the transition from an infrahuman AI to a
>human-level AI would be; consequently, the latter transition would open fewer doors. To cite
>one concrete difference between our species' respective developmental
>lines, the point at which an AI crosses the line into human-level
>smartness would probably not be the point at which the AI first became
>capable of what we would consider "abstract thinking"... although it might
>be the point at which the AI gained some other, equally valuable ability.
>
>--
>Eliezer S. Yudkowsky http://intelligence.org/
>Research Fellow, Singularity Institute for Artificial Intelligence


