From: Samantha Atkins (firstname.lastname@example.org)
Date: Mon Sep 09 2002 - 06:32:09 MDT
On Monday 09 September 2002 14:59, Eliezer wrote:
> (This started in a discussion on the Extropians list, but after writing it
> I thought it should be forwarded to SL4.)
> Let's say you're an enlightened, moral, altruistic post-Singularity
> civilization. From your perspective, there are several possible states
> that matter can occupy:
> 1) Matter can be organized into citizens of your own enlightened, moral,
> altruistic post-Singularity civilization.
Hmmm. Are you starting with assumptions that:
a) only matter matters;
b) your own post-Singularity civ should have say over all matter that you can reach?
> 2) Matter can be organized into boring uncomplex things like stars. This
> is rather pointless so you want to transform this matter into state (1).
> I mean, all else being equal, why not? Also, Fred is running out of room
> for his MP3 collection.
Well, stars are "pretty", are interesting energy sources, might have planets
which might have life now or later and so on. Hardly necessarily viewed as pointless.
> 3) Matter can be organized into evolving but nonsentient replicators.
> This still isn't exactly a lot of complexity by comparison with (1) and is
> arguably (hey, I think so) pretty much morally the same as (2).
What does "morality" have to do with it? Are you implying a moral duty to
organize all matter as complexly as possible? At what level do you consider
some replicators to be sentient? If they are capable of evolving to
sentience, does their moral status change as far as what you, the powerful
SAI, might do or not do with that bunch of matter?
> 4) Matter can be organized into evolved, intelligent, but pre-Singularity
> entities. In this case their quality of life is pretty hideous by
> comparison with (1) so you want to show up and rescue them.
Why would you consider it moral to "rescue" other life to your presumed much
preferable state? Are you sure you can do that without effectively
destroying that life and simply rearranging the matter into entities more to
your liking? Rescue could spell doom if not deeply considered and very
carefully and patiently, if at all, implemented.
> 5) Matter can be organized into a different enlightened post-Singularity
> civilization than your own. In this case you want to show up and say hi -
> exchange information to increase the total nonduplicated complexity of the
> universe.
> 6) Matter can be organized into shapes that are dangerous and unpleasant
> as the result of Singularities gone wrong. You want to know where these
> infestations are, maybe even do something about them, and you certainly
> want to show up *before* that point if at all possible. In fact, this
> alone provides an adequate rationale for cataloguing all the matter in the
> universe and making sure none of it is developing into hostile
> superintelligence. (Alternatively, you might want to run away into your
> own little hidey-hole universe, in which case you are not visibly
> engineering galaxies or whatever, which is how this conversation got
> started.)
What determines what "gone wrong" is and is not? What is and is not
"hostile" or potentially so? How do you avoid the "drop a big chunk of rock
on any developing life remotely threatening" scenario?
> I realize we don't know what choices smarter-than-human intelligences
> would make, but the choice to sit and contemplate your own navel seems a
> lot more anthropomorphic than the alternative. Even if you don't want to
> reproduce, or you don't want to reproduce too often, you'll still want to
> grow your mind over time.
What do you mean by "contemplate your own navel" here? There are a tremendous
number of things that can be fruitfully explored without feeling a pressing
need to catalog all matter for possible threats. I would consider making the
latter a most pressing task to be more likely pathological than the alternative.
> Now, is there any good reason why an enlightened, moral post-Singularity
> civilization would *not* be out to absorb, say hi to, or rescue all matter
> in the universe?
Yes, assuming morality includes respect for, or appreciation of, forms of
existence different from itself: living and non-living, sentient and
non-sentient. At
the least I believe a moral super-intelligence would understand or find out
very quickly that "rescue" is often not a very good idea. It would probably
find out pretty soon that wiping out or co-opting potential hostiles too
easily puts it on the "possibly hostile" list of some other entity. It also
potentially wipes out what might have become very productive and beneficial
entities in the future.
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT