RE: Threats to the Singularity.

From: Ben Goertzel (ben@goertzel.org)
Date: Sun Jun 23 2002 - 07:46:14 MDT


Hi,

Samantha, you raise a lot of good points in your e-mail, and I'll only
address a couple of them due to lack of time today...

> Also, none of us can answer but from where we are right now. If
> the answer is that human beings are seen as expendable, even
> while we are ourselves fully human, then I think that needs to
> be examined and questioned carefully.

It is clear that we humans, generally speaking, view human lives as
expendable. A deep study of human history is not necessary to reveal this!

> > which are very different attitudes. I do not personally hold either
> > attitude, but I can sympathize more with the "indifference" attitude --
> > because, from the grand perspective, one relatively primitive
> > intelligent species may not be all that important.
>
> I don't see anything at all "grand" about such a perspective.

By "grand" I mostly meant "very large-scale" rather than "local". No value
judgment was implied.

> > I think the right thing is for the human-race technological vanguard to
> > simultaneously work on building artificial superintelligence, AND on
> > creating better human beings (genetic engineering, brain augmentation,
> > etc.).
> >
>
> Ethics, morality, better social and economic systems...
>
>
> > And in fact, this is what is happening.
> >
>
> It is not at all clear to me that the non-hardware parts of the
> problem are being addressed much.

I guess that, right now, it is technically oriented folks who *see* the
Singularity coming, so most of the writing and thinking about it is
technical in nature.

As Singularity awareness spreads more broadly, non-tech-focused people will
have more to say about it, I'm sure.

There are certainly some very hard problems here. For example, keeping any
kind of freedom and democracy going in a population of same-species
organisms is tough enough. How to keep these qualities alive in a world
full of sentients with radical differences in intelligence and other
qualities is a problem that may be beyond human intelligence.... I doubt
these problems will be solved prior to the creation of superhuman minds.

> > I think it is possible to achieve some degree of nonattachment from
> > one's species, in terms of one's reasoning and one's value system. Of
> > course, one can never completely remove inferential and emotional bias
> > from oneself -- it's not even clear what this would mean!
> >
>
> I hardly see how to build a value system on the well-being of
> that which does not yet exist.

In fact, humanity has so far failed to build anything near a consensus value
system regarding *human well-being*.

I strongly suspect that moral and social thinking is going to continue to
flail around, raising interesting questions and loads of possible answers
but arriving at no definite conclusions. Meanwhile, technological
development will proceed more and more rapidly, dragging the moral/social
aspect along with it.... I am not saying this is necessarily *good*, just
making a prediction.

-- Ben G


