Re: Threats to the Singularity.

From: Samantha Atkins
Date: Mon Jun 17 2002 - 03:59:58 MDT

Gordon Worley wrote:


> I certainly have compassion for humans. I really do want humans to be
> uploaded.
> At the same time, if it turned out that we couldn't upload, I wouldn't
> say "okay, that's it, no Singularity or SIs" (this is as much a death
> sentence to humans as killing them outright).

Nope. There are other choices. Do the Singularity but leave
earth alone. Guard it. Perhaps police it just enough that some
conflict could not kill all. At the least we would be no worse
off and probably better.

The only way that there is no way to take full copies of human
beings and recreate them is if MNT is also impossible or there
really is some immaterial something that just doesn't survive
the process. Otherwise a process can be found and perfected
enough to work. If so, then take periodic snapshots of all
humans. If they die and they wish it, then bring them back
until they learn better games.

There is much an SI could do to eradicate disease and relieve
much misery. Just the fact that an SI was possible changes the
complexion of human society tremendously. This change could be
molded carefully to produce as much good as possible.

In short, if you make it a categorical imperative to save and
improve the lot of as much of humanity as possible no matter
what, we would all feel a lot better about the Work. It would
also, I believe, be much more likely to come to a good end (or
beginning) for us.

> I don't see uploading humans as the goal of the Singularity, though. I
> see the goal as creating a good successor to humanity; to creating a new
> intelligence that is beyond our own. If we can benefit from that, I say
> all the better (I wouldn't support Friendly AI if I thought that a good
> course of action would be to create an SI with no regard for humans).

I see the goal as, if you will, "a new Heaven and a new Earth".
I see it as a freeing of sentient life from its constraints
and limitations. If I didn't see it this way I would not be
able to continue or participate. I would not be able to look
human beings in the eye and imagine the world without them or
the world itself gone and the SI in its place. It would crush
me. I would have no joy in my work, nothing to offer those who
dream and have dreamed of better since humanity began. Call me
sentimental if you wish but on this I stand firm.

>> Funny, though, how the longer one lives the more precious
>> life seems to become. They draft 17 year olds because they'll
>> fearlessly charge machine gun nests. I myself at 19 thought
>> that the human race would be better off not existing just
>> because, as you wrote, the "world is full of pain and
>> suffering". (What I failed to see at the time is that
>> it's equally full of joy and gladness, and it's obvious
>> now how evolution had to make it so.)
> There are plenty of good things in life and in humans. We've done
> plenty of good things. I hope that we'll go on to do more good things
> and transcend into a new level of intelligence. At the same time, if we
> don't ever get anywhere, it doesn't matter. If humans lived in caves
> and ate bugs and never did anything new ever, the universe would hardly
> miss them.

We are what we make of ourselves. You, small human, wish
instead to make something totally different that is better than
all of us and makes us irrelevant and unnecessary. At the
least we are the becoming of an SI. I hope it has the wit to
prize that. I hope we have the wit to understand our
interconnection with It and with all sentience, and that the
disowning of any sentience is the betrayal of it in total.

But it seems to take more than chopping logic to understand this.

- samantha

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT