Re: How hard a Singularity?

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Sun Jun 23 2002 - 05:43:46 MDT


Eugen Leitl wrote:
> On Sun, 23 Jun 2002, Eliezer S. Yudkowsky wrote:
>
>>This is not true of a human-level seed AI; how do you think the seed
>>AI got there in the first place?
>
> Of course this is correct, and this is exactly the reason I'm favouring
> brute-force uploads over AIs. Brute-force uploads have real trouble
> figuring out their modus operandi before being able to enhance it
> (trivial mods like scaling cortex size excluded).

Except that by the time you have real uploads, you also have a tremendous
amount of knowledge that can be used to enhance them. The scenario as given
was an impossibility: a human, uploaded all at once and with no surrounding
scientific advancement, onto hardware that runs that human at precisely the
human subjective rate. It illustrates something, I suppose, but not much.

If you upload a real human, then there will be more than enough neuroscience
known to allow for a few "trivial mods" that will produce a transhuman, and
from there we're well into hard-takeoff land. Also, I would expect the
first uploads to run either considerably faster or considerably slower than
a biological human. One month of physical time is not one month of
subjective time.
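For concreteness, here is a minimal sketch of the scaling involved; the
speedup factors below are hypothetical illustrations, not figures anyone
in this thread has claimed:

    # Subjective time scales linearly with the speedup factor.
    # The speedup values used below are hypothetical illustrations.
    def subjective_days(physical_days: float, speedup: float) -> float:
        """Subjective days experienced over a span of physical time."""
        return physical_days * speedup

    # An upload at 1000x biological speed lives ~82 subjective years
    # in one physical month:
    print(subjective_days(30, 1000) / 365)  # ~82.2 years

    # An upload at 0.01x experiences only ~7 subjective hours in the
    # same physical month:
    print(subjective_days(30, 0.01) * 24)   # 7.2 hours

Either way, "one month after uploading" tells you almost nothing about how
much thinking the upload has done in that month.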

Incidentally, you have yet to explain how both computing and physical
technology get all the way to the point where millions of people can be
reliably scanned, uploaded, and run, without one single person producing a
seed AI or one single upload transcending. "Law" does not seem powerful
enough to account for such an enormous technological disparity. Or do you
have any specific laws in mind?

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

