From: Mitchell Porter (you_have_some_sort_of_bug@yahoo.com)
Date: Mon Mar 19 2001 - 14:13:56 MST
Eliezer says occasionally that the Sysop scenario
is just a *guess* as to what a Friendly super-AI
would choose to do. I think people would be more
likely to remember this if there were another
scenario on offer, and in fact my own default
picture of a Friendly Singularity is this other
one: Universal Uplift.
The basic meaning is presumably clear, although
variations on it are possible (uplift sentients
of Earth; uplift non-sentients as well; uplift
all sentients ever within reach, whether alien
or Earth-evolved). The essential idea: if
everyone is a Friendly superintelligence, who
needs a Sysop? Only the non-sentient, the newly
sentient, and the not yet uplifted.
The obvious criticism of Universal Uplift is
that it would be a Borg-like imposition, which is
why I think of it as working by enticement rather
than imposition. In effect, the uplifter leaves a
trail of crumbs for the upliftee to follow, and by
the time you reach the end of the trail, you're a
Friendly superintelligence.