RE: Threats to the Singularity.

From: Ben Goertzel (ben@goertzel.org)
Date: Thu Jun 13 2002 - 19:02:34 MDT


Samantha,

You wrote:
> Well, in my personal opinion and speaking plainly, disowning
> one's species, literally all of humanity including one's self and
> loved ones, for an unknown set of hopefully stable higher
> intelligences is a very deep form of treason and betrays a
> singular contempt for humanity that I find utterly appalling.

But I really feel this is not correct.

The attitude you describe does not necessarily imply *contempt* for
humanity.

It *could* imply contempt, but it could equally well imply mere
*indifference* to humanity.

I confess that I feel somewhat indifferent to humanity, sometimes (not
always!). Sometimes I just think of humanity as a vehicle for intelligent
mind. And I think: if a better vehicle comes along, why is the
human-vehicle so important? And why are our individual human minds --
including mine -- so important? In the big picture of the evolution of life
and mind in the cosmos, surely they aren't....

Sure, the human race is important to me emotionally ... just as my family
is important to me emotionally ... just as my own limbs are important to me
emotionally ... but why are my personal human emotions so important?

Indifference to humanity could come out of nihilism. But it could also come
out of having values that go beyond any particular species, or any
particular vehicle, and focus on general things like intelligence and
creation.

In short, there are MANY different psychological motives that could underlie
the attitude Mike displays, contempt being only one of them.

> How about we just grow a lot more sane human beings instead of
> digging continuously for technological fixes that really aren't
> fixes to the too-often cussedness of local sentients? Replacing
> them with something that is faster and arguably smarter but may
> or may not be any more wise is not an answer. Scrapping
> sentients is to be frowned upon even if you think you can and
> even do create sentients that are arguably better along some
> parameters.

"Wisdom" is a nebulous human concept that means different things to
different people, and in different cultures.

However, I think it's pretty likely that intelligent software WILL be wiser
than humans, for reasons Eliezer has pointed out nicely in his writings.
We have an evolutionary heritage that makes it really tough for us to be
wise, and there seems to be no reason why intelligent software would have
any similar problem.

-- Ben G


