RE: Re[6]: continuity of self [ was META: Sept. 11 and Singularity]

From: Ben Goertzel
Date: Tue Sep 17 2002 - 06:56:53 MDT

Chris Rae wrote:
>In case you
> haven't noticed,
> outside of the 'free' world, life is defined by hardship, misery
> and suffering.

This is a side point for sure, but I can't help taking the bait this early
New Mexico morning...

Chris, have you traveled much outside the Western world?

I have, and based on what I've seen, your statement is a bit of an
exaggeration.

There are pockets of really terrible adversity now -- say, war-torn sections
of the Congo, Ethiopia, etc. [Although reportedly, the pygmies in the Congo
are OK, having retreated deep into the forest.]

But from what I saw in (for example) China and Indonesia, there are a lot of
people enjoying themselves reasonably well in spite of having a lot less
money than us. Not that they wouldn't prefer more personal and political
freedom, or more possessions -- hey, so would I! -- but.... Ditto when I
traveled thru Bulgaria when it was still communist.

Of course I've also seen plenty of bad poverty, most notably in India (which
is part of the "free world" in the sense that it has a democratic government
and is a US ally). But although life for the beggars in Bombay looked
awfully bad, life for the poor in the Indian *villages* looked highly
acceptable to me....

[As a side point, it does not seem to be true that people are more
"well-off" statistically in Third World countries with democratic,
US-friendly governments. There are plenty of poor "free" countries where
the people are worse off than in China, for example.]

Finally, having known a certain number of wealthy people here in the US, I
can also affirm that the absence of hardship does not imply the absence of
misery and suffering! I'm sure the percentage of misery and suffering is
less among wealthy people than among poor people, but perhaps not *as much*
less as you're suspecting.

I do feel the world political system is sick, in the sense that the world is
now generating enough wealth that physically painful impoverishment (as with
Indian beggars, or poor folk in wartorn Congo) "shouldn't" exist. I just
don't think it's quite as badly sick as your comment implies.

I agree that the reason for this sickness is the operation of the human
brain.... This is pretty damn clear with the political situation in so many
African countries, where endless battles between rival warlords are
destroying lives and nations. Look at and run thru the
list of central African nations and see how many have war-based travel
advisories [a source of vexation to me since I really want to take a trip
into the Central African Republic jungle one of these years...].

> Lets say -hypothetically of course - I hand you (Aaron) a true AI entity
> that is able to improve its own processing capability. What exactly are
> you going to do with it? Sure, it can provide you with everything you ever
> dreamed of, but what about the other 6 billion people on this planet? What
> is their fate? Are you going to allow these people to choose their own
> destiny, or are you going to force your version of what's best on them?
> For example, when a human rights group used FORCE to stop an ancient
> ritual that involved the mutilation of female genitalia as a right of
> passage, it was the women who fought back claiming it was their right and
> no one had the authority to take that away, even though the human rights
> group was acting in the best interest of the women.
> My point is, if/when singularity occurs, it can't be forced on the general
> population, they must willingly accept the technology. IMO, creating the
> technology necessary to achieve singularity is only half the battle, the
> other half is convincing the people it is going to improve their lives.
> The true goal of the singularity is in fact to improve peoples lives,
> isn't it?

The true goal of the Singularity is NOT merely to improve human lives.

This is ONE goal. The advancement of mind in (this part of) the universe to
a new level is also a valid and valuable goal, quite separately from the
improvement of human lives.

If NO human lives were improved at all by the Singularity, but a tremendous
new form of intelligence were created, I'd consider the Singularity very
worthwhile nonetheless.

Another interesting point is that if human lives are improved too much, they
will cease to be *human* lives. What is it to be human? Isn't our
"humanity" tied up with the various weaknesses of our brain, which cause us
to experience human unhappiness, human happiness, etc.? If the "irrational"
pain and suffering on being dumped by a lover is vanquished by
neuroengineering, then have we not become less human? [I don't say that
becoming less "human" is always bad, by the way.]

Finally, it could easily be argued that the job of convincing people the
Singularity is good will be much easier once tech has advanced another 20
years. A direct *demonstration* of the superiority of tech-enhanced ways of
life will go a lot further than any arguments we can pose. So there's a
good argument for keeping the evangelism low-key until we truly have tech to
back it up.

The best argument AGAINST this, so far as I can see, is that a more positive
general attitude toward the Singularity could lead to

a) more R&D funding [though industry seems to be doing OK on its own,
generally speaking, things could clearly be better -- the world governments
could start a few Manhattan Projects for AGI, for example, resulting in
Novamente, SIAI and A2I2 and so forth suddenly getting well-funded]

b) fewer Luddites, ergo less chance of Singularity researchers getting
detonated when S-Day gets nearer...

For these reasons, evangelism now may be worthwhile.

-- Ben G

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:41 MDT