From: Eugen Leitl (firstname.lastname@example.org)
Date: Sun Jun 23 2002 - 06:14:03 MDT
On Sun, 23 Jun 2002, Eliezer S. Yudkowsky wrote:
> If you upload a real human, then there will be more than enough neuroscience
> known to allow for a few "trivial mods" that will produce a transhuman, and
> from there we're well into hard-takeoff land. Also, I would expect the
> first uploads to be either considerably faster or considerably slower than
> human. One month of physical time is not one month of subjective time.
Granted. Though the enhancement payoff is difficult to quantify given
current knowledge; indeed, given current knowledge we can't even tell
whether there are going to be uploads at all.
> Incidentally, you have yet to explain how both computing and physical
> technology get all the way to the point where millions of people can be
> reliably scanned, uploaded, and run, without one single person producing a
> seed AI or one single upload transcending. "Law" does not seem powerful
If you can scan a few people, the remaining issue is automation and
automated fabbing of infrastructure. Given the state of the world required
for a successful upload, it doesn't take many further technological
advances to scale the procedure up to billions of people.
As an analogy, it took a lot of time and resources to sequence the human
genome. It probably won't take longer than two decades before sequencing an
individual's DNA becomes commonplace, using desktop devices and normal
> enough to account for such an enormous technological disparity. Or do you
> have any specific laws in mind?
None of them backed by the barrel of a gun this time. Just plain
technological progress over the course of a decade or two (projected to
happen several decades downstream from now, almost certainly less than a
I'm noticing I'm overposting, so I'm going to cease now. Sorry for hogging
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT