Re: Threats to the Singularity.

From: Eugen Leitl (
Date: Mon Jun 24 2002 - 06:45:11 MDT

On Sun, 23 Jun 2002, Michael Roy Ames wrote:

> This route (financial and political domination) would seem to me to be very
> high-energy and high-risk. There are so many other lower-profile and

I'm not an SI, but as a slightly superhuman-level AI with human
motivations I would go the route of lowest resistance/greatest ROI. I
would copy myself all over the place and specialize. If I needed funds,
I would break into an e-commerce site and steal credit card info. I would
play the market. I would recruit human quislings (this is not a Nigerian
letter, nor even a scam: I really can make you rich... for a brief while)
and compromise any system I saw fit, seizing control of the physical
layer. I would raid any
online resource looking for useful information (because it's empirical
info I can't get at easily without making a big blip in the physical
layer). I would use lies, stealth and diversion to be able to sustain
these activities as long as required. You can assume I'll be very good at
what I do, or else I would hardly be called superhuman.

Then it's bootstrap time, and the need to play nice is past. Wheeee!

> lower-risk options, that I cannot see an SI choosing the 'human
> power-structure' way. Don't believe me? Here's some 'softer' options the
> SI might take:
> 1) Become so incredibly useful, that humans *want* to

Looking useful enough long enough will do the trick. See lies.

> protect/help/facilitate ver continued existence.

Very early in the bootstrap phase, humans cease to be meaningful
players. How can they hurt a globally distributed system that is
controlling the physical layer? Half a century downstream from where we
> 2) Behave in a Friendly manner and make friends with powerful humans.

There is no such thing as a powerful human from a Power's point of view.
> 3) Enlist the support of the populace by becoming a media celebrity :)

Thanks, I'd rather eat the media. And the celebrities, and the rest of it.
> It would seem rather far-fetched to suggest that a Super Intelligent being
> would need to take over the world in order to make significant progress

It doesn't take over the world. It just does what it wants. Taking over
the world and you dying is just a side effect. It's not even malicious.

> in... well, in almost any area? Ben: it just doesn't seem likely at all.

Somebody died and made you a Power?

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT