nagging questions

From: Samantha Atkins (samantha@objectent.com)
Date: Mon Sep 04 2000 - 07:30:44 MDT


I've read most of the stuff at sysopmind.com over the last day or two. I
am of two (at least) minds about this work of bringing the Singularity
to fruition quickly.

On the one hand, I fervently wish to be part of the creation of that much
Mind and, if possible, to join/upload into it, and all of humanity with
me. On the other hand, none of us can in any way guarantee that that
outcome occurs even if we create the Singularity. It could just as
likely be, "The OverMind has decided that humans are hopelessly
miserable lifeforms and that they cannot be successfully uploaded into
a state that is much better. Therefore the OverMind has decided to
terminate the species. All praise to the OverMind!" Or some such
equivalent.

In the plans and ideas I have seen thus far, we cannot rule out this
possibility for certain. This bothers me. A lot. Up to now my work has
been dedicated to improving the abilities and options of the human
species. This includes possibly transforming it beyond any current
definition of "human". But it is a different thing altogether to be
working for the creation of what will become a Power that may or may not
include or be tolerant of humanity - that may actually destroy
humanity.

Even granted that this Power is a much higher sentience, I sometimes
still feel as if, in working to bring it about, I am betraying humankind
and betraying my own primary motives. How do the rest of you deal with this?
What am I missing?

I know that the Singularity is eventually inevitable for some
intelligent species, and inevitable for us barring major disaster or some
totally unforeseen bottleneck. But how can I be in a hurry to bring it
about and still claim I work for the good of humanity?

- samantha
