Re: Threats to the Singularity.

From: Michael Roy Ames (michaelroyames@hotmail.com)
Date: Sun Jun 23 2002 - 22:56:00 MDT


James Higgins wrote:
>
> I disagree. If the goal is to protect yourself from humans then becoming
> exceptionally powerful, on their terms, is a good answer. Especially if
> doing so is a relatively easy task.
>

If the goal of the SI is 'to protect yourself from humans', then we will
have already lost. If the SI is built with, or acquires, an adversarial
attitude toward humans, then we will be toast. My suggested alternatives
(usefulness, friendliness, entertainment-source) were all intended for a
non-adversarial SI, and would have a good chance of gaining significant
cooperation from humans.

While I agree that it is very likely any SI would quickly become the most
powerful entity on this planet, the source of that power will probably not
come from any existing financial or political base. Why? Because the
existing infrastructure is so incredibly cumbersome, inefficient, and slow
that there would be no point in 'taking over'. The problems facing a newly
minted SI are unlikely to be bottlenecked on resources, because higher
intelligence can do an awful lot with very little (relatively speaking, of
course).

Michael Roy Ames


