Re: Threats to the Singularity.

From: Gordon Worley (redbird@rbisland.cx)
Date: Mon Jun 17 2002 - 07:45:06 MDT


On Monday, June 17, 2002, at 05:31 AM, Samantha Atkins wrote:

>
>
> Gordon Worley wrote:
>
>> On Saturday, June 15, 2002, at 07:38 PM, Samantha Atkins wrote:
>>> How can or should one be "indifferent" to one's species and to the
>>> survival of all existing higher sentients on this planet? If one is
>>> for increasing intelligence (how one defines that and why it is the
>>> only or most primary value are good questions) and the increase of
>>> sentience, I fail to see how one can be cavalier about the
>>> destruction of all currently known sentients. How can one stand for
>>> intelligence and yet not care about billions of intelligent beings
>>> that already exist?
>> First off, attachment to humanity is a bias that prevents rational
>> thought.
>
>
> Rational? By what measure? How is attachment to the well-being of
> ourselves and all like us irrational?

Eliezer addressed this in his reply to this thread earlier. It is
irrational if the attachment is blind. You must have some reason why you
need to stay alive; otherwise, provisions for staying alive will most
likely get in the way of making rational decisions.

> Whether we transform or simply cease to exist seems to me to be a
> perfectly rational thing to be a bit concerned about. Do you see it
> otherwise?

Sure, you should be concerned. I think that the vast majority of
humans, uploaded or not, have something positive to contribute, however
small. It'd be great to see life get even better post-Singularity, with
everyone doing new and interesting good things.

>> I and others have broken this attachment to keep it from clouding our
>> thinking.
>
> So you believe that becoming inhuman and uncaring about the fate of
> humanity allows you to think better?

If only it were easy to become inhuman, but it's not.

Uncaring is inaccurate. I do care about humans and would like to see
them upload. I also care about any other intelligent life that might be
out there in the universe, and about helping it upload. I just don't
care about humans so much that I'd give up everything to save humanity
(unless that was the most rational thing to do).

>> It is the result of being genetically related to the rest of humanity,
>> where the death of all human genes is a big enough problem to cause a
>> person to give up a goal or die to save humanity.
>
> I do not necessarily agree that we can just write it off as a genetic
> relatedness issue at all. Whether there is sentient life and whether
> it continues, regardless of its form, is of intense interest to me.
> That some forms are not genetically related is not of high relevance to
> the form of my concern. So please don't assume that explains it away or
> makes the issue go away. It doesn't.

There is an ethical issue; however, the irrational attachment is the
result of relatedness. A proper ethic is not so strong that it prevents
you from even thinking about something, the way evolved ethics do.

>> Some of us, myself included, see the creation of SI as important
>> enough to be more important than humanity's continuation. Human
>> beings, being
>
> How do you come to this conclusion? What makes the SI worth more than
> all of humanity? That it can outperform them on some types of
> computation? Is computational complexity and speed the sole measure of
> whether sentient beings have the right to continued existence? Can you
> really give a moral justification or a rational one for this?

In many ways, humans are just over the threshold of intelligence.
Compared to past humans we are pretty smart, but compared to the
estimated potentials for intelligence we are intellectual ants. Despite
our differences, all of us are roughly of equivalent intelligence and
therefore on equal footing when deciding whose life is more important.
But it's not nearly so simple. All of us would probably agree that,
given the choice between saving one of two lives, we would choose to
save the person who is most important to the completion of our goals, be
that reproduction, having fun, or creating the Singularity. In the same
light, suppose a mob is about to come in to destroy the SI just before
it takes off and there is no way to stop them other than killing them:
on one hand you have the life of the SI, which is already more
intelligent than the members of the mob and will continue to get more
intelligent, and on the other the lives of 100 or so humans. Given such
a choice, I pick the SI.

In my view, more intelligent life has more right to the space it uses
up. Of course, we hope that intelligent life is compassionate and is
willing to share. Actually, I should be more precise. I think that
wiser life has more right to the space it uses (but you can't be wiser
without first being more intelligent). I would choose a world full of
dumb humans trying hard to do some good over an Evil AI.

>> self-aware, do present more of an ethical dilemma than cows if it turns
>> out that you might be forced to sacrifice some of them. I would like
>> to see all of humanity make it into a post-Singularity existence and I
>> am willing to help make this a reality.
>
> How kind of you. However, from the above it seems you see them as an
> ethical dilemma greater than that of cows but if your SI, whatever it
> turns out really to be, seems to require or decides the death of one or
> all of them, then you would have to side with the SI.
>
> Do I read you correctly? If I do, then why do you hold this position?
> If I read you correctly then how can you expect the majority of human
> beings, if they really understood you, to consider you as other than a
> monster?

If an SI said it needed to kill a bunch of humans, I would seriously
start questioning its motives. Killing intelligent life is not
something to be taken lightly and done on a whim. However, if we had a
FAI that was really Friendly and it said "Gordon, believe me, the only
way is to kill this person", I would trust in the much wiser SI.

This is the kind of reaction I expect, and it is why I usually avoid
pointing this view out, though I'm a bit disappointed to get so much of
it on SL4. I never go out of my way to say that human life is not the
most important thing to me in the universe, but sometimes it is worth
talking about.

--
Gordon Worley                     `When I use a word,' Humpty Dumpty
http://www.rbisland.cx/            said, `it means just what I choose
redbird@rbisland.cx                it to mean--neither more nor less.'
PGP:  0xBBD3B003                                  --Lewis Carroll
