RE: How hard a Singularity?

From: Ben Goertzel (ben@goertzel.org)
Date: Sat Jun 22 2002 - 20:35:58 MDT


> Because the upload, if she's smart, will not concentrate on working as a
> researcher on some other, ordinary technological project; she will
> concentrate on improving herself. The very first change that upload makes
> which successfully increases her own intelligence (though it might take
> much more than a month to manage this, for an unprepared upload) will
> increase the ease of all successive improvements, which will increase the
> ease of further improvements even further - a runaway positive feedback
> loop.
>
> --
> Eliezer S. Yudkowsky http://intelligence.org/
> Research Fellow, Singularity Institute for Artificial Intelligence

Eliezer,

I think that everyone -- or *almost* everyone -- on this list understands
this projected positive feedback loop.

The main point at issue in the present thread is the *rate of increase*
induced by this positive feedback loop, not the existence of the loop
itself.
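
For concreteness, the difference can be cartooned with a one-line growth
law, dI/dt = c * I^p, where I stands in for "intelligence" and p measures
how strongly each improvement eases the next. This is purely a toy
illustration, not anyone's actual model; the rate constant c and exponent p
are made-up parameters. The loop exists for any p > 0, but the timescale it
implies ranges from glacial to explosive:

# Toy model of a self-improvement feedback loop, for illustration only.
# Forward-Euler integration of dI/dt = c * I**p with made-up parameters.

def simulate(p, c=0.1, i0=1.0, dt=0.01, horizon=1000.0, cap=1e6):
    """Return the time at which I first exceeds `cap`, or None if it
    never does within the time horizon."""
    i, t = i0, 0.0
    while t < horizon:
        i += c * (i ** p) * dt
        t += dt
        if i >= cap:
            return t
    return None

for p in (0.5, 1.0, 1.5):
    t = simulate(p)
    if t is None:
        print(f"p = {p}: never reaches a millionfold gain within the horizon")
    else:
        print(f"p = {p}: millionfold gain after roughly {t:.0f} time units")

Run it and p = 0.5 gets nowhere within the horizon, p = 1.0 takes roughly
138 time units, and p = 1.5 blows up in roughly 20. Same feedback
structure, wildly different rates; which regime we are actually in is
exactly the empirical question at issue.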

The other point at issue is that the projected positive feedback loop is a
(reasonable, well-motivated, but still speculative) extrapolation from
current knowledge to a domain very different from any of the experience
that current knowledge is based on. It is a mistake to attach *too much*
confidence to projections about a domain radically different from any we
have experienced. Of course, we have to make the best projections we can
and act on them, but that doesn't mean we should stop reminding ourselves
that our projections are nowhere near certain.

-- Ben G


