Re: How hard a Singularity?

From: Eliezer S. Yudkowsky
Date: Sun Jun 23 2002 - 03:40:44 MDT

Ben Goertzel wrote:
> The main point at issue in the present thread is the *rate of increase*
> induced by this positive feedback loop, not the existence of the loop
> itself.
> The other point at issue is that the projected positive feedback loop is a
> (reasonable, well-motivated, but still speculative) extrapolation from
> current knowledge, to a domain that is very different from any of the
> experience current knowledge is based on. It is mistaken to attach *too
> much* confidence to projections regarding an unknown domain of experience,
> radically different from previously experienced domains. Of course, we have
> to make the best projections we can and act on them, but this doesn't mean
> we should stop reminding ourselves that our projections are nowhere near
> certain.


If you let your "uncertainty" take the form of "but things might work out
the human way", i.e., at human speed and at a subjective rate that is
comprehensible to humans; if you let your "uncertainty" say that a positive
feedback loop is a speculation *and therefore* the alternative is the
ordinary linear kind of progress we know, then the "uncertainty" is not
rational uncertainty but simply an excuse not to let go of whatever
assumptions or domain models you started out with. Pascal's Wager works the
same way for the same reasons. There is nothing privileged about the
alternative you present as "maybe it will just..."

Maybe the Singularity will *not* be a projected positive feedback loop, but
if so it will be faster, not slower. I am as uncertain as you, or more so,
but my uncertainty is a volume centered around the positive feedback loop
that seems less odd for an AI than the strange human way of doing things,
not a volume centered around the human world.

Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence
