Re: Fighting UFAI

From: justin corwin (outlawpoet@gmail.com)
Date: Wed Jul 13 2005 - 15:05:25 MDT


On 7/13/05, Carl Shulman <cshulman@fas.harvard.edu> wrote:
> Faced with the first seed AI, why not... <snip AI-boxing scenario>

AI-boxing is not possible. The procedure you go through is
insignificant. The place you went wrong is not the idea, but the
framework. You can't improve the safety of an AI by trying to contain
it. If it's unfriendly, it either escapes or gives you a gamed FAI
spec that contains its own specification inside it where you can't
see. Or it's too stupid to do either, and can't give you the spec. Or
it was friendly anyway and your procedure bought you nothing.

-- 
Justin Corwin
outlawpoet@hell.com
http://outlawpoet.blogspot.com
http://www.adaptiveai.com


This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:51 MDT