Re: How hard a Singularity?

From: Eugen Leitl (eugen@leitl.org)
Date: Sun Jun 23 2002 - 04:27:51 MDT


On Sat, 22 Jun 2002, Eliezer S. Yudkowsky wrote:

> Because the upload, if she's smart, will not concentrate on working as a
> researcher on some other, ordinary technological project; she will
> concentrate on improving herself. The very first change that upload makes

This assumes 1) that she's allowed to, and 2) that the payoff is considerable.

Both are questionable assumptions.

> which successfully increases her own intelligence (though it might take much
> more than a month to manage this, for an unprepared upload) will increase

I challenge you to make nontrivial progress in a month if given full
low-level access to your current cognitive processes (wetware emulated at
the molecular level), beyond producing a lot of neat neuroscience papers.

> the ease of all successive improvements, which will increase the ease of
> further improvements even further - a runaway positive feedback loop.

You still assume that all positive autofeedback loops are in the same
class. In the short run they're not.
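
To make "class" concrete, here is a toy sketch (mine, purely illustrative;
the constant k, the exponents and the 30-step horizon are arbitrary
assumptions, not a model of uploads): two loops that are both positive
feedback, but only one of them compounds on a short horizon.

def grow(rate_exponent, k=0.1, steps=30, capability=1.0):
    # One round of self-improvement per step: the gain is k times the
    # current capability raised to rate_exponent.
    for _ in range(steps):
        capability += k * capability ** rate_exponent
    return capability

# rate_exponent = 1.0: gain proportional to current capability
#   (compound interest, exponential class)        -> roughly 17x in 30 steps
# rate_exponent = 0.5: diminishing returns per round (still positive
#   feedback, but only polynomial growth)         -> roughly 6x in 30 steps
print(grow(1.0))
print(grow(0.5))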


