Re: Think of it as AGI suiciding, not boxing

From: Phillip Huggan (cdnprodigy@yahoo.com)
Date: Sat Feb 18 2006 - 23:13:21 MST


To clarify, there would be no user interface, and nothing but mechanical computer components, so there is no danger of physics tricks; the only output would be a purely mechanical computer log. Yes, I know AI Boxing talk is passé, but can someone quickly point out why this architecture would fail? If the AGI is friendly, we lose out on a great deal of its immediate engineering prowess, building only the safe product designs it spits out. If it is unfriendly, we are saved. It doesn't matter how smart an entity is if the jail permits no key.

Phillip Huggan <cdnprodigy@yahoo.com> wrote: Can friendliness be implemented by specifying that the AGI shut itself down if vis self-coding directly interacts with the universe beyond a very limited set of computer operations? In concert with a relatively inert computer substrate, such as a molecular computer, how could it cause harm? We could still benefit a great deal from such a limited architecture.
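For concreteness, here is a minimal sketch (in Python) of the shutdown rule being proposed, assuming a fixed whitelist of permitted, purely computational operations. All names here (OperationGuard, ALLOWED_OPS, ShutdownTriggered) are hypothetical illustrations, not any real system or API.

    # Hypothetical sketch: every operation the self-code requests is checked
    # against a fixed whitelist of purely computational operations; anything
    # outside the whitelist is treated as interacting with the outside world
    # and triggers an immediate shutdown.

    ALLOWED_OPS = {"read_memory", "write_memory", "arithmetic", "log_result"}


    class ShutdownTriggered(Exception):
        """Raised when the self-code attempts an operation outside the whitelist."""


    class OperationGuard:
        def __init__(self, allowed_ops):
            self.allowed_ops = frozenset(allowed_ops)

        def request(self, op_name, *args):
            # Any operation not on the whitelist shuts the whole system down.
            if op_name not in self.allowed_ops:
                raise ShutdownTriggered(f"disallowed operation: {op_name}")
            # Otherwise the (purely computational) operation would be performed.
            return f"performed {op_name}"


    if __name__ == "__main__":
        guard = OperationGuard(ALLOWED_OPS)
        print(guard.request("arithmetic", 2, 2))       # allowed
        try:
            guard.request("open_network_socket")       # outside the whitelist
        except ShutdownTriggered as e:
            print("shutting down:", e)

This is only an illustration of the whitelist-plus-shutdown idea; the proposal in the thread is that the check be enforced by the AGI's own goal specification and the inert substrate, not by a software wrapper like this.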



