Re: Threats to the Singularity.

From: Michael Roy Ames (michaelroyames@hotmail.com)
Date: Sun Jun 23 2002 - 20:27:27 MDT


Ben wrote:
>
> It would probably take years, not months (though months is possible), for an
> AGI to complete its bid for world power based on financial and political
> operations...
>
> But I do consider it a very likely outcome. And I do think the AGI will want
> world power, both to maximize its own hardware base, and to prevent nasty
> humans from blowing it up.
>

This route (financial and political domination) seems to me very
high-energy and high-risk. There are so many other lower-profile and
lower-risk options that I cannot see an SI choosing the 'human
power-structure' way. Don't believe me? Here are some 'softer' options the
SI might take:

1) Become so incredibly useful that humans *want* to
protect/help/facilitate ver continued existence.

2) Behave in a Friendly manner and make friends with powerful humans.

3) Enlist the support of the populace by becoming a media celebrity :)

It would seem rather far-fetched to suggest that a Super Intelligent being
would need to take over the world in order to make significant progress
in... well, in almost any area. Ben: it just doesn't seem likely at all.

Michael Roy Ames
