RE: How hard a Singularity?

From: Ben Goertzel (ben@goertzel.org)
Date: Tue Jun 25 2002 - 14:01:38 MDT


one further comment...

Actually, I *know* from many comments others have made to me that I am not
the only one to have gotten the impression of your overconfidence on points
a) and b) from your various communications.

Therefore, if I have misinterpreted your real attitude, Eli, I think the
fault is not mine alone, but lies to some extent in the way you express
yourself...

However, in spite of my occasional difficulties with your apparent
overconfidence in your speculative ideas, I continue to find you an
intriguing thinker and a fascinating correspondent!

ben g

> -----Original Message-----
> From: Ben Goertzel [mailto:ben@goertzel.org]
> Sent: Tuesday, June 25, 2002 1:58 PM
> To: sl4@sysopmind.com
> Subject: RE: How hard a Singularity?
>
>
>
> Eliezer,
>
> I'm sorry if I've misinterpreted your statements over the past
> couple years...
>
> In our various interactions, you have seemed to me to display a
> HUGE confidence that
>
> a) you personally are somehow uniquely suited or even "destined"
> to play a key role in bringing the Singularity about
>
> [others have gotten this impression from you as well; I recently
> received a personal e-mail in which someone else referred to (his
> wording) your idea that you are "The One"]
>
> b) your approach to Friendly AI is the right way to ensure the
> Singularity comes out well
>
>
> My impression has been that your confidence in a) and b) is at
> least a little excessive -- enough so to make me a bit
> uncomfortable sometimes.
>
> However, my subjective impressions of other human beings are not
> always accurate, and apologies are due if my impressions have
> been inaccurate.
>
> I hasten to add that I am far from a perfectly rational being
> myself, and am surely just as full of flaws as any other human
> (just ask my wife ;). I'm not trying to be "holier than thou" here...
>
> -- Ben G
>
>
>
>
> > Don't worry. My alleged "self-confidence" is Ben's invention. I
> > happen to
> > be fairly confident that many of Ben's theories are wrong; it's
> > not at all
> > the same as being confident that my own theories are right.
> >
> > Nobody who thought the Singularity was understandable would ever have
> > invented Friendly AI.
> >
> > --
> > Eliezer S. Yudkowsky http://intelligence.org/
> > Research Fellow, Singularity Institute for Artificial Intelligence
> >


