Re: Building a friendly AI from a "just do what I tell you" AI

From: Byrne Hobart (sometimesfunnyalwaysright@gmail.com)
Date: Tue Nov 20 2007 - 20:34:25 MST


> * It CANNOT modify itself, nor does it want to modify itself.

I'm not sure how intelligent it is if it can't modify itself. Did you
mean that it can't modify certain parts of itself -- that, for
example, it can't decide that 2 + 2 is 5, but it can decide that it
probably wasn't Colonel Mustard in the Conservatory with the lead
pipe? I don't see how it can be intelligent without being able to
change, because that would mean new information would always remain
external to the intelligence itself. That would lock it in the
Chinese Room and throw away the key.



This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:00 MDT