Static uploading is SL3 (was: the 69 of us)

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Fri Nov 17 2000 - 10:38:06 MST


Spudboy100@aol.com wrote:
>
> This is a question more for the future. But if somebody gets uploaded, and
> stays inside their fantasy world, will not insanity result? Is this why there
> have been no repeated "signals" SETI-wise? That civilizations join the Land
> of the Lotus Eaters and forget their primate origins? If one spends their time
> as an artilectual rhomboid inside cybernetic dimension -7, and forgets what
> it's like to go to a bookstore and have coffee with a friend, won't this just
> serve to set us up for a bad end?

This is more of an SL3 than an SL4 topic; it's the sort of thing that ought to
appear in a FAQ rather than be discussed on the list. It's not on the
frontiers of thought; been there, done that. So unless there is a demand (on
*this* list) for extended discussion of the subject, this single reply should
probably stand as the abbreviated version of the answer that belongs in that
hypothetical FAQ. (Of course, on the Extropians list, you can always get
an extended debate on any topic, no matter how settled. <smile>)

Basically, your question leaves out the entire concept of intelligence
enhancement and self-modification - and you'll note that static uploads are
listed under SL3 on the original "Future Shock" page. A static upload might
go insane or spend a billion years on flower arrangements; who knows? But I
can't see a transhuman upload as vulnerable to insanity, even leaving aside
fine-grained control of simulated neurology or the ability to self-modify. A
transhuman should simply be able to think faster than the forces responsible
for insanity in biological humans, and see any problems coming a mile away.
And for uploads-turned-Powers, I can't even see asking the question. Likewise
for uploads in a Sysop Scenario, who can write their own insurance policies.

None of the uploads in _Diaspora_ are transhuman, but even they can modify
themselves better than a static upload could - they can run "outlooks" and so
on. I would refer you to _Diaspora_ for further information.

Furthermore, asking the question with respect to SETI seems illogical; it
would only take one sane civilization or one sane member of one civilization
to send out signals. The Fermi Paradox still holds. As for Diaspora-type
cosmic hammer blows, staying attached to the meat certainly wouldn't help, so
it's clear that this class of scenario only increases the urgency of uploading
- and that's all we need to know, isn't it? And of course we, or at least the
Sysop in the Sysop Scenario, would need to keep an eye on physical reality.
Who ever said otherwise?

-- -- -- -- --
Eliezer S. Yudkowsky http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence
