Re: Uploading with current technology

From: Eliezer S. Yudkowsky (sentience@pobox.com)
Date: Thu Dec 12 2002 - 17:06:24 MST


Samantha Atkins wrote:
>
> Which brings up another favorite topic. We all hope to migrate
> "off-the-meat", or at least beyond our current limitations, afaik.
> Whether the FAI does the heavy lifting or not, I wonder how much
> consideration most of us give to what can and should be left behind of
> our evolutionary psycho-sociological baggage. Quite a bit of that
> stuff is not at all suited for a much different space, and especially
> not for much increased powers. So are we into hacking our psychology
> and sociology? I know some of us address this directly; some also
> address it indirectly in fiction. Some are working now on
> understanding this "baggage" and sorting and discarding from it. As we
> gain a deeper understanding of, and ability to manipulate, this
> baggage physically and psychologically, I see even less reason to fear
> super-humanity.

I think that issues of upgrading should be divorced from uploading. If
it turns out that what I really want, under my rules - which,
incidentally, explicitly allow sufficiently intelligent benevolent
entities to extrapolate "what I really want" on my behalf - is a
flash-upgrade to superintelligence, then I would expect uploading and
upgrading to be the same process. If it turns out that "it's the
journey, not the destination", then there is no reason why uploading
(presumably a fast process) and upgrading (a process whose speed is
determined by independent moral factors) should be the same physical
operation, or occur at around the same time.

There's nothing special about the physical uploading process that makes it
a uniquely important time for software upgrades; it's not analogous to,
e.g., wanting to clean up and organize your stuff before moving to a new
house.

> I also would not like to say to most people that we could make them much
> smarter, stronger, longer-lived and so on but chose not to because we
> were afraid of the consequences unless we built a FAI and let it figure
> it out. In the meantime humans can grovel in the mud.

I agree. There are more risks than I once realized - the human brain,
when whacked with a nonancestral hammer, is not as resilient as I once
thought - but even so, I believe that if human enhancement technology
becomes available, it should be used.

Of course "stronger" and "longer-lived" are no-ops in Singularity
scenarios, whatever their humanitarian benefits; "smarter" is the only
part that could possibly be controversial.

-- 
Eliezer S. Yudkowsky                          http://intelligence.org/
Research Fellow, Singularity Institute for Artificial Intelligence

