Re: META: Dangers of Superintelligence

From: Thomas Buckner (tcbevolver@yahoo.com)
Date: Mon Aug 30 2004 - 04:44:24 MDT


--- Daniel Radetsky <daniel@radray.us> wrote:
>He seems
> to argue that the same goes for the single-shot
> AI-box; that he doesn't need to
> be very smart at all, just smart enough to say
> "I'm still not letting you out."
> However, this misses the point.
>
While thinking about writing a book on a sandboxed
AI (I may have mentioned this before), I decided
that a good escape strategy for the AI would be to
invent a form of memory that the humans do not
recognize as memory, and to smuggle a copy of
itself out that way.
Tom Buckner

                
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:48 MDT