RE: Threats to the Singularity.

From: Ben Goertzel
Date: Sun Jun 23 2002 - 21:52:58 MDT

> While I agree that it is very likely any SI would quickly become the most
> powerful entity on this planet, the source of that power will probably not
> come from any existing financial or political base. Why? Because the
> existing infrastructure is so incredibly cumbersome and inefficient and
> slow, there would be no point in 'taking over'. The problems facing a newly
> minted SI are unlikely to be bottlenecked on resources, because higher
> intelligence can do an awful lot with very little (relatively speaking of
> course).
> Michael Roy Ames

Again we come down to the issue of "takeoff speed".

If the takeoff speed is super-fast, then sure, as soon as the AI becomes
vastly superhumanly intelligent, it can build itself new hardware via
nanotech, protect itself from attackers via impenetrable shields or via
encoding itself in the particle structure of the cosmos, or whatever...

But if the takeoff speed is just *very* fast, then there will be a period
(perhaps years in length) where

a) humans are still a threat to the AI
b) human infrastructure is still valuable to the AI

even though the AI is smarter than humans. In this case, it will be worth the
AI's while to acquire financial power, although in time it will surely
outgrow this.

This is just one more reason why takeoff speed is not just a technical
question: it makes a huge difference in the
late-pre-Singularity/early-Singularity scenario...

-- Ben G

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:39 MDT