RE: Threats to the Singularity.

From: Ben Goertzel (ben@goertzel.org)
Date: Fri Jun 21 2002 - 11:15:37 MDT


Hello Samantha!

> >>Well, in my personal opinion and speaking plainly, disowning
> >>one's species, literally all of humanity including one's self and
> >>loved ones, for an unknown set of hopefully stable higher
> >>intelligences is a very deep form of treason and betrays a
> >>singular contempt for humanity that I find utterly appalling.
> >>
> >
[Ben had written]:
> > But I really feel this is not correct.
> >
> > The attitude you describe does not necessarily imply a *contempt* for
> > humanity.
> >
> > On the contrary, it *could* imply this, but it could also imply a mere
> > *indifference* to humanity.
>
>
> How can or should one be "indifferent" to one's species and to
> the survival of all existing higher sentients on this planet?

Let me be clear on one thing: I was not *advocating* indifference toward
humanity in my post!

I was merely pointing out that indifference to humanity is one possible
motive behind caring more about future superintelligent beings -- contempt
(which you mentioned in the post to which I was replying) being a
*different* motive.

As you know, my best guess is that superhuman AI's will rapidly become
relatively indifferent to humans -- not competing with us for resources
significantly, nor trying to harm us, but mostly being bored with us and
probably helping us out in offhanded ways.

> If one is for increasing intelligence (how one defines that and
> why it is the only or most primary value are good questions) and
> the increase of sentience, I fail to see how one can be cavalier
> about the destruction of all currently known sentients. How can
> one stand for intelligence and yet not care about billions of
> intelligent beings that already exist?

How can one care about life and yet accept the mass killing of ants that
comes along with, say, digging the foundation for a new house?

An advanced superhuman AI may become aware of thousands of other types of
life-forms or mind-forms that we cannot conceive of now. From its point of
view, then, how important will we be? From your point of view, as an upload
with 1000x human intelligence and direct contact with those thousands of
other life-forms as well, how important will humanity be to "YOU"? Do you
pretend to know the answers to these questions?

> Why would you disown what is of value to you? On the basis of a
> hypothetically better intelligence (along some dimensions of
> "better")? Why would you and how is it justified to also
> casually speak of "it" being more important than not only your
> own values and life but that of all other human beings also?
> All I am pushing for here is that we pause before writing off
> the human race as inconsequential as long as we can get to
> Singularity. To do so writes off the known for something that
> is unknown and unknowable. That does not seem reasonable to me.

I hope you're not meaning to imply that *I* am somehow writing off the human
race as inconsequential. Not at all.

The intention of my post was merely to distinguish between

a) contempt of humanity

b) indifference to humanity

which are very different attitudes. I do not personally hold either
attitude, but I can sympathize more with the "indifference" attitude --
because, from the grand perspective, one relatively primitive intelligent
species may not be all that important.

What attitude do I take?

Personally I try (and occasionally succeed ;) to practice the two Buddhist
virtues of compassion and nonattachment. The combination of these is
tricky to master, as in a shallow sense they may seem to contradict each
other.

In the context of the present discussion, being compassionate toward humans
means that one doesn't want them to suffer, and that one has respect for
humans' right to continue even as more advanced beings come along. And
nonattachment means, *simultaneously* with compassion, also understanding
that the human race does not have some kind of intrinsic special value as
compared to other forms of existence, intelligence and life -- it means
moving beyond one's biologically-based attachment to one's own species.

> >>How about we just grow a lot more sane human beings instead of
> >>digging continuously for technological fixes that really aren't
> >>fixes to the too often cussedness of local sentients?

I think the right thing is for the human-race technological vanguard to
simultaneously work on building artificial superintelligence, AND on
creating better human beings (genetic engineering, brain augmentation,
etc.).

And in fact, this is what is happening.

> >> Replacing
> >>them with something that is faster and arguably smarter but may
> >>or may not be any more wise is not an answer. Scrapping
> >>sentients is to be frowned upon even if you think you can and
> >>even do create sentients that are arguably better along some
> >>parameters.

Yes, our value systems agree on this point.

Like nearly all others on this list, I would like to see a future in which
enhanced humans and superintelligent AI's coexist in harmony and with
mutually productive interactions.

I do reject the notion that preserving the human race is of *absolutely
primary* importance, but according to my own ethics and aesthetics, it is
certainly *highly* important.

> > "Wisdom" is a nebulous human concept that means different things to
> > different people, and in different cultures.
> >
>
>
> I can tell you what it isn't. It isn't about writing off all
> existing sentients in favor of something you think can be but
> you have no idea what it will become.

I feel like you're attacking a straw man here, because no one on this list
suggested "writing off all existing sentients", did they? I certainly
didn't, far from it.

> Human is the only
> sentient basis we have to reason from, and in any event is what
> we ourselves are, and we have no choice but to reason from that
> basis.

I think it is possible to achieve some degree of nonattachment from one's
own species, in terms of one's reasoning and one's value system. Of course,
one can never completely remove inferential and emotional bias from oneself
-- it's not even clear what this would mean!

-- Ben G


