Re: Building a friendly AI from a "just do what I tell you" AI

Date: Tue Nov 20 2007 - 21:25:22 MST

On 11/21/07, Byrne Hobart - wrote:
> > * It CANNOT modify itself, nor does it want to modify itself.
> I'm not sure how intelligent it is if it can't modify itself. Did you
> mean that it can't modify some of itself -- that, for example, it
> can't decide that 2 + 2 is 5, but it can decide that it probably
> wasn't Colonel Mustard in the Conservatory with the lead pipe? I
> don't see how it can be intelligent without being able to change,

It can modify itself in the sense that it can learn new facts, just as
we humans can. But it is unable to fundamentally rewire itself, and it
has no desire to. To use an analogy: a computer can store new data on
its hard drive, but it cannot rewire its own circuitry. I hope this
makes it clear.

This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:01:00 MDT