From: Martin Striz (firstname.lastname@example.org)
Date: Fri Jun 09 2006 - 12:22:15 MDT
On 6/7/06, Robin Lee Powell <email@example.com> wrote:
> > BTW, they will also probably be the most complex thing that they
> > have to deal with and learn about. It is logically impossible to
> > contain a complete internal model of yourself.
> Why do people keep saying that?
That's an interesting gimmick, but a quine has no internal model of
itself. My point was that, just as we often can't predict our own
future actions because we are oblivious to the substrate-level
workings of our minds, an AI won't be able to simultaneously model
its own substrate-level activity, so there will always be some
missing information, and some error.
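For context, a quine is a program whose output is its own source text. A minimal Python sketch (my own illustration, not from the original thread) shows why this is a gimmick rather than self-modeling: the program reproduces its text through string substitution, without representing anything about its own execution.

```python
# Classic Python quine: printing a format string applied to itself
# reproduces the program's source exactly. Note there is no "model"
# of the program here -- just one string and a substitution.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Running it prints the two lines above verbatim; the program "knows" its text, but nothing about the interpreter, memory, or hardware actually executing it, which is the substrate-level information at issue.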
This archive was generated by hypermail 2.1.5 : Wed Jul 17 2013 - 04:00:56 MDT