Re: An essay I just wrote on the Singularity.

From: Perry E. Metzger
Date: Wed Dec 31 2003 - 12:23:33 MST

"Ben Goertzel" <> writes:
> 1)
> I don't see any strong reason to believe that strong nanotech is closer in
> time than Friendly AI.

Of course, there is no reason not to believe it either. We have no
idea, frankly, what will show up and in what order, until it happens.

> Taking a dispassionate point of view, surely the scientific jury is still
> out regarding the question of whether strong AI or strong nanotech will
> arrive first.


> 2)
> It's just not true that humans developing strong nanotech will *necessarily*
> lead to destruction.
> Whether strong nanotech, in human hands, leads to destruction or not depends
> on a lot of "details" including scientific and political ones.

Well put.

> So far as preparing for the future goes, the most important thing is to get
> ourselves -- individually and collectively, with whatever human and/or AI
> intelligence is available -- in a mental position that is capable of dealing
> adequately with the advent of wild surprises. This is much more important
> than planning for any particular conjectural contingency.

Yes. I'm straining my neck nodding here.


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:43 MDT