Re: Post-Singularity Trade (was: Sysops, volition, and opting out)

From: Eliezer S. Yudkowsky
Date: Wed Aug 15 2001 - 12:20:33 MDT

"J. R. Molloy" wrote:
> Let me explain. You see, George W. Bush has an IQ of around 90.

Urban legend.

> A super powerful intelligence would be
> smart enough to understand that if it stepped out of line... Click! It goes
> offline, and the next super powerful intelligence is taken off the shelf. (If
> you can make one SI, you can make a million of 'em.)

This is sheer silliness and I've said so enough times already.

> An ultra-intelligent machine will be the last thing humans will ever need to
> invent. How do we keep it docile? An ultra-intelligent machine will be able to
> solve that problem easily. Just ask it!

No ultra-intelligent machine would. Even with Friendly AI this is a
structurally unsolvable problem. A Friendly AI may know how to keep
verself Friendly, but there is a vast difference between that and telling
humans - who are not themselves all that reliably altruistic - how to use
the Friendly AI as a tool. Not even a Friendly superintelligence will
lock verself in a black box, or tell you how to keep ver confined.

-- -- -- -- --
Eliezer S. Yudkowsky
Research Fellow, Singularity Institute for Artificial Intelligence

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:37 MDT