RE: Threats to the Singularity.

From: Ben Goertzel (ben@goertzel.org)
Date: Sun Jun 23 2002 - 08:56:38 MDT


Eli wrote:

> Ben, it is extremely unlikely that Powers would attempt to exterminate us
> deliberately. But Eugen is perfectly correct in that indifference very
> probably equates to extermination.

What if its attitude toward us is "general indifference coupled with mild
affection"? Sort of like my own current attitude toward ants, say?

If I'm given a choice of two locations on which to build my new house, and
one of the locations has a huge ant colony on it, I'll choose the one
without the ant colony. But if both potential locations have ant colonies,
Ben's a-gonna squash him some ants! My affection for ants is only mild...

What saves humans from being the ants in this metaphor? Either

a) the superhuman AI has more than a mild affection for humans, or

b) its interests and needs are so different from those of humans that there
is no competition for resources (land, in the ants-and-houses metaphor), or

c) it's so powerful that sparing us while achieving its needs requires only
a trivial effort on its part (no need to vaporize Earth when it's so easy to
vaporize planets elsewhere in the galaxy using teleportation tech...)

> Not an excuse for ignoring whatever scraps of evidence you can scrape up.
> If you choose to be uncertain then the center and distribution of your
> uncertainty volume will end up being even more indefensible than whatever
> conclusion you wanted to avoid. You can't avoid having an opinion just by
> being uncertain.

And in fact, I expressed my opinion quite clearly, early in this thread. I
emphasized the uncertainty of my opinion in a later post, only because it
seemed others were assigning to it a certainty it did not possess.

-- Ben G


